US20090040385A1 - Methods and systems for controlling video compositing in an interactive entertainment system - Google Patents
- Publication number
- US20090040385A1 (U.S. patent application Ser. No. 12/256,374)
- Authority
- US
- United States
- Prior art keywords
- signal
- video
- image
- scene
- participant
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
- H04N9/74—Circuits for processing colour signals for obtaining special effects
- H04N9/75—Chroma key
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
Definitions
- a system for real-time video compositing comprises a means for generating a composite signal by chroma-key mixing a first video signal and a real-time second video signal; and a means for switching between the selection of the first video signal and the selection of the composite signal, wherein the means for switching is controlled according to reference information stored in at least one data file that relates to the first video signal.
- a method for video compositing comprises receiving a first video signal having multiple frames; receiving a second video signal; creating a composite signal by chroma-key mixing the first video signal and the second video signal; and selectively switching, with a video switcher, between outputting the first video signal and outputting the composite signal, wherein said selective switching is performed automatically and is based on a control signal received by the video switcher, wherein the control signal is derived at least in part from prerecorded data that corresponds to frames of the first video signal.
- a method for coordinating the display of multiple signals comprises receiving a first signal; receiving a second signal; forming a composite signal comprising portions of the first signal and portions of the second signal; receiving with a switcher the first signal and the composite signal; receiving reference data that corresponds to portions of the first signal; and switching between displaying the first signal and displaying the composite signal, wherein the timing of said switching is based on information contained in the reference data.
- a method for creating a data file used for controlling the superimposing of video images comprises identifying a portion of a first video image during which a second video image is to be superimposed onto the first video image, said first video image portion having a beginning frame and an ending frame; recording in a data file first information that identifies said beginning frame; and recording in the data file second information that identifies said ending frame.
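- As a concrete illustration of the data-file creation method just described, the sketch below records beginning and ending frame identifiers for the portions of a prerecorded video selected for superimposition. The file format, field names, and frame values are hypothetical; the specification does not prescribe any particular encoding.

```python
import json

def create_compositing_data_file(segments, path):
    """Record, for each selected portion of the prerecorded video, the
    frame that begins the portion and the frame that ends it."""
    records = [
        {"beginning_frame": start, "ending_frame": end}
        for start, end in segments
    ]
    with open(path, "w") as f:
        json.dump({"segments": records}, f, indent=2)

# Example: two portions of the prerecorded image chosen for overlay.
create_compositing_data_file([(1200, 1850), (4020, 4410)], "reference_data.json")
```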
- FIG. 1 illustrates a real-time video compositing system according to one embodiment of the invention.
- FIG. 2 is a block diagram of one embodiment of a compositor device.
- FIG. 3 illustrates a user interface of one embodiment of a compositor device.
- FIG. 4 is a block diagram of an embodiment of a compositor device that is structured to receive input from two video recorders.
- FIG. 5 is a flow chart depicting one embodiment of an interactive video compositing process.
- FIG. 1 illustrates one embodiment of a real-time video compositing system 10 .
- the compositing system 10 is used to selectively superimpose images. These images may comprise images that can be viewed in real-time, prerecorded images, or a combination thereof.
- the compositing system 10 may superimpose a real-time image onto an image that has been prerecorded, such as a movie.
- the compositing system 10 generally comprises a compositor device 12 , a video source 14 , a video recorder 16 , and a display 18 .
- the compositor device 12 receives input signals from the video source 14 and the video recorder 16 and outputs a signal to the display 18 .
- composite is a broad term and is used in its ordinary sense and includes without limitation the superimposing or combining of multiple signals, such as, for example, video and/or audio signals, to form a composite signal.
- compositor refers to any device or system, implemented in hardware, software, or firmware, or any combination thereof, that performs in whole or in part a compositing function.
- real time is a broad term and is used in its ordinary sense and includes without limitation a state or period of time during which some event or response takes place.
- a real-time system or application produces a response to a particular stimulus within a certain response time.
- a device processing data in real time may process the data as it is received by the device.
- a real-time signal is one that is capable of being displayed, played back, or processed within a particular time after being received or captured by a particular device or system.
- this particular time is on the order of one millisecond.
- the particular time may be longer than one millisecond.
- the particular time may be on the order of hundreds of milliseconds.
- the particular time may be less than one millisecond.
- the particular time may be on the order of microseconds.
- “real time” refers to events simulated at a speed similar to the speed at which the events would occur in real life.
- the video source 14 includes any device, system or technology used to generate, receive, capture, read, supply or store video data.
- the video source 14 may generate an audiovisual signal that includes a video portion that can be processed to produce a video signal (e.g., to produce a visual image) and an audio portion that can be processed to produce an audio signal (e.g., sound at a level high enough to be heard by a human ear).
- the video source 14 comprises a digital video disk (DVD) player.
- the video source 14 comprises a memory that stores data representing video content.
- the video source 14 may comprise a device that receives a video transmission, such as through a cable network, a satellite dish, an antenna, or a network.
- the video source 14 may comprise a television, a video cassette recorder (VCR), a CD+G player or a digital video recorder.
- the compositing system 10 may include multiple video sources 14 , each being coupled to the compositor device 12 .
- the compositing system 10 may comprise a multiplexer or a switch that selects a signal from one of multiple video sources 14 .
- the compositing system 10 comprises a DVD player that reads data from a DVD and a cable box that receives a video transmission over a coaxial cable line. A two-input multiplexer may then be used to select between a signal from the DVD player and a signal from the cable box.
- the video source 14 may be coupled to the compositor device 12 by any medium that provides for video signal transmission.
- the video source 14 may be coupled to the compositor device 12 through an RCA cable, an S-cable, a coaxial cable, Ethernet, wireless technologies and the like.
- the video source 14 may supply audio content along with video content. This audio content may be delivered on the same or different mediums as the video content.
- FIG. 1 depicts an embodiment of the invention wherein the video source 14 is external to the compositor device 12 .
- the video source 14 may be internal to the compositor device 12 .
- the compositor device 12 may comprise a DVD player or may comprise a memory having stored video data.
- the compositing system 10 may comprise at least one video source 14 that is internal and at least one video source 14 that is external to the compositor device 12 .
- the compositing system 10 includes the video recorder 16 .
- the video recorder 16 comprises any device, system or technology that is capable of converting real-time video images into an electronic signal, such as a digital or an analog signal.
- the video recorder 16 is capable of the real-time conversion and transmission of video images.
- the video recorder 16 is a video camera.
- the video recorder 16 may comprise a camcorder, such as an analog camcorder or a digital camcorder.
- the video recorder 16 may be coupled to the compositor device 12 through an RCA cable, an S-cable, a coaxial cable, Ethernet, wireless technologies and the like.
- the compositing system 10 may include multiple video recorders 16 , each being coupled to the compositor device 12 .
- multiple video cameras may be coupled to the compositor device 12 .
- the video recorder 16 may be internal or external to the compositor device 12 .
- the compositing system 10 also comprises the display 18 .
- the display 18 receives an output signal from the compositor device 12 and converts the output signal to at least a video image.
- the display 18 comprises a television that is coupled to the compositor device 12 through RCA cables.
- the display 18 may include a video projector, a monitor or the like and may be coupled to the compositor device 12 through any medium that provides for video signal transmission, as has been previously described.
- the display 18 may also be used to provide instructions or data to the user or users of the compositing system 10 .
- menu selections or command prompts may be displayed to the user through the display 18 .
- dialogue prompts such as are used in general karaoke machines may be portrayed on the display 18 so as to assist a user in reciting the appropriate lines.
- the compositing system 10 may comprise multiple displays 18 .
- the display 18 may also be internal or external to the compositor device 12 .
- the compositor device 12 may include a screen that portrays a video image to the user. Such a screen would allow a user to have visual feedback as to the final output of the compositing system 10 without having to look at an external display.
- the compositing system 10 may also comprise a media storage device (not shown) that stores the signal output by the compositor device 12 .
- the compositing system 10 may comprise a memory configured to store in digital form a copy of the output signal that is sent to the display 18 .
- the compositing system 10 may output a signal only to the media storage device instead of the display 18 . In such an embodiment, the output video and audio content could be stored for later playback on another device.
- the media storage device may be included with the display 18 .
- FIG. 2 illustrates a block diagram of one embodiment of the compositor device 12 .
- the compositor device 12 allows a user to selectively overlay images in real time onto a second video image, such as prerecorded video content.
- the compositor device 12 comprises control circuitry 20 , a memory 22 , a DVD player 24 , a multiplexer 26 , a chroma-key mixer 28 , a switcher 30 and a user interface 32 .
- the components of the compositor device 12 are modules that comprise logic embodied in hardware or firmware, or that comprise a collection of software instructions written in a programming language, such as, for example, C++.
- a software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpretive language such as BASIC.
- software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts.
- Software instructions may be embedded in firmware, such as an EPROM or EEPROM.
- hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors.
- the functions of the compositor device 12 may be implemented in whole or in part by a personal computer or other like device.
- the components of the compositor device 12 need not be integrated into a single box.
- the components can be separated into several subcomponents or can be separated into different devices that reside at different locations and that communicate with each other, such as through a wired or wireless network, or the Internet. Multiple components may be combined into a single component.
- the components described herein may be integrated into a fewer number of modules. One module may also be separated into multiple modules.
- the control circuitry 20 manages the operation of components of the compositor device 12 .
- the control circuitry 20 is a special purpose microprocessor.
- the control circuitry 20 may be implemented as an application-specific integrated circuit (ASIC).
- ASIC application-specific integrated circuit
- the control circuitry 20 may be implemented as one or more modules, which modules may be configured to execute on one or more processors.
- the modules may comprise, but are not limited to, any of the following: hardware or software components such as object-oriented software components, class components and task components, processes, methods, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, applications, algorithms, techniques, programs, circuitry, data, databases, data structures, tables, arrays, variables, or the like.
- the control circuitry 20 communicates with the memory 22 .
- the memory 22 may comprise any buffer, computing device, or system capable of storing computer instructions and data for access by another computing device or a computer processor, such as, for example, the control circuitry 20 .
- the memory 22 comprises random access memory (RAM).
- the memory 22 may comprise other integrated and accessible memory devices, such as, for example, read-only memory (ROM), programmable ROM (PROM), and electrically erasable programmable ROM (EEPROM).
- the memory 22 comprises a removable memory, such as a floppy disk, a compact disk (CD), a ZIP® disk, a DVD, a removable drive or the like.
- the memory 22 stores data files that include data or information regarding the content of particular media.
- a data file or a group of related data files contains information specific to a particular movie or prerecorded video footage.
- data files for a particular movie may be referenced by using the movie's unique serial number.
- the information contained in the data files may identify scenes or segments of the particular movie or video that have been catalogued as being suitable for video compositing.
- the data files may also contain content relating to dialogue prompts for particular characters, menu options, and other data relating to scenes available for video compositing.
- the data files contain reference information that identifies particular frames or points in a video source that may be later used in the compositing process.
- the reference information is used by the compositor device 12 to identify the frames of a movie scene in which a particular character is present.
- the reference information may be used to trigger multiple signals for use in the compositing process.
- the reference information may contain both beginning and ending reference points, wherein the beginning reference point indicates the commencement of a particular feature in the video source, such as the entrance of a character into a scene, and wherein the ending reference point identifies the ending of a particular feature, such as when the character leaves the scene.
- reference information is a broad term and is used in its ordinary sense and includes without limitation any type, or combination of types, of data that stores or contains information regarding particular media.
- reference information may comprise reference points that identify scenes containing particular characters.
- reference information is not limited to such reference points.
- reference information may comprise code, symbols, alphanumeric information, or the like that represent a song, a particular event, or a particular image that is contained or represented in particular media, such as an audiovisual signal.
- data files may be stored in the memory 22 .
- data files stored in the memory 22 may contain information relating to the individual frames of a particular movie, such as Star Wars.
- the preprogrammed data files associated with Star Wars would identify scenes in the movie that have been selected as being suitable as a background scene for video compositing.
- the data files may identify when a particular character, such as Darth Vader®, is present in a specific scene. This data file information allows the user to choose between pre-selected scenes and facilitates the generation of a signal by the control circuitry 20 to be used in coordinating and creating a composite video signal.
- the data files may be preprogrammed in the memory 22 . In other embodiments, the data files may be later saved in the memory 22 by the user.
- the data files need not be stored in the memory 22 .
- the data files may be generated in real time, generated on the fly, derived or received from an external source or device, or generated by the compositor device 12 itself.
- data files for particular media could be downloaded from the Internet, transferred from a removable storage medium, or even programmed by the user.
- the data files may also be embedded in a closed-caption signal or be saved on a DVD, such as in a bonus material section, that contains the corresponding movie.
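- One way to picture such data files is as a table keyed by a medium's serial number, with each entry listing the scenes catalogued for compositing. The sketch below is purely illustrative: the serial number, field names, character label, and frame values are invented, and real data files could equally be downloaded, read from a disc's bonus section, or programmed by the user as described above.

```python
# Hypothetical reference data for one disc, keyed by its serial number.
REFERENCE_DATA = {
    "DVD-0001234": {
        "title": "Example Movie",
        "scenes": [
            # Character whose footage may be replaced, plus the beginning
            # and ending reference frames for each catalogued scene.
            {"character": "Villain", "beginning_frame": 1200, "ending_frame": 1850},
            {"character": "Villain", "beginning_frame": 4020, "ending_frame": 4410},
        ],
    },
}

def lookup_data_file(serial_number):
    """Return the data file for the inserted disc, or None if the
    compositor has no reference information for that medium."""
    return REFERENCE_DATA.get(serial_number)
```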
- the memory 22 is depicted as being external to the control circuitry 20 , in other embodiments of the invention, the memory 22 may be internal to the control circuitry 20 .
- the memory 22 may exist as a cache in the control circuitry 20 .
- the memory 22 may also be external to the compositor device 12 .
- the memory 22 comprises an external hard drive.
- the memory 22 may also comprise multiple memory devices for storing data or information.
- the DVD player 24 is one embodiment of a video source for the compositor device 12 .
- the DVD player 24 functions as a general purpose DVD player and outputs video and audio content stored on a DVD to the multiplexer 26 .
- the DVD player 24 also includes a counter that adjusts itself based on which frame of the DVD is being read. For example, the DVD counter may correlate each frame of the DVD with a specific time code relating to the media stored on the DVD. This counter enables the DVD player 24 to jump to, identify or read specific frames of video content stored on DVDs.
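- A counter of this sort amounts to a mapping between frame numbers and time codes. The conversion below assumes a fixed frame rate of 30 frames per second, chosen only for illustration; the actual rate depends on the disc.

```python
FRAMES_PER_SECOND = 30  # assumed for illustration only

def frame_to_timecode(frame_number, fps=FRAMES_PER_SECOND):
    """Convert a frame index into an HH:MM:SS:FF style time code."""
    seconds, frames = divmod(frame_number, fps)
    minutes, seconds = divmod(seconds, 60)
    hours, minutes = divmod(minutes, 60)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

def timecode_to_frame(timecode, fps=FRAMES_PER_SECOND):
    """Convert an HH:MM:SS:FF time code back into a frame index."""
    hours, minutes, seconds, frames = (int(part) for part in timecode.split(":"))
    return ((hours * 60 + minutes) * 60 + seconds) * fps + frames

assert timecode_to_frame(frame_to_timecode(123456)) == 123456
```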
- the DVD player 24 also reads DVD serial numbers so as to identify the media content contained by the particular DVD and may communicate the serial number to the control circuitry 20 .
- the functioning of the DVD player 24 may be controlled by the control circuitry 20 , which loads appropriate data files from the memory 22 .
- the DVD player reads the serial number of the DVD and communicates the number to the control circuitry 20 .
- the control circuitry 20 uses the serial number to find the appropriate data files stored in the memory 22 .
- the data files identify the media content of the DVD as being the Star Wars movie and also identify which scenes, or frames, are to be played by the DVD player 24 . In this way, the control circuitry 20 is able to manage the functioning of the DVD player 24 .
- the DVD player 24 also reads DVDs that contain data as opposed to a movie or a video.
- the DVD player 24 is used to read DVDs that contain data files that are associated with several movies or videos, which data files may be copied to the memory 22 .
- the multiplexer 26 is configured to accept signals from multiple external sources as well as from the DVD player 24 .
- the multiplexer 26 is shown as being structured to receive signals from the DVD player 24 , a cable network, an antenna and a satellite.
- the multiplexer 26 may be configured to receive fewer or more signals.
- multiplexer 26 may be configured to receive a streaming video over the Internet or data from a cable box.
- the multiplexer 26 may also be configured to receive auxiliary signals from an external DVD player or a VCR.
- the multiplexer 26 is structured to select one of multiple input signals, the selection being based on a control signal.
- the control circuitry 20 supplies the control signal to the multiplexer 26 .
- the multiplexer 26 automatically selects the signal from the DVD player 24 when a DVD is inserted therein and selects the signal from the other available signals when no DVD is present in the DVD player 24 .
- the user may input a selection through the user interface 32 .
- one embodiment of the multiplexer 26 outputs the selected signal to the chroma-key mixer 28 and the switcher 30 .
- other switching devices or routers may be used in place of the multiplexer 26 to select between multiple input signals and to communicate the selected signal to other components.
- the multiplexer 26 may receive signals from other video sources.
- the multiplexer 26 may also be coupled to more or fewer video sources than are depicted in FIG. 2 .
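- The automatic source selection described above can be pictured as a simple priority rule: prefer the internal DVD player when a disc is present, otherwise fall back to an input chosen through the user interface. The function and input names below are illustrative rather than taken from the specification.

```python
def select_video_source(dvd_loaded, user_choice, external_inputs):
    """Return the input the multiplexer should pass through to the
    chroma-key mixer and the switcher."""
    if dvd_loaded:                      # a disc in the DVD player wins
        return "dvd"
    if user_choice in external_inputs:  # otherwise honor the user's selection
        return user_choice
    return external_inputs[0]           # default to the first external input

print(select_video_source(True, "cable", ["cable", "antenna", "satellite"]))       # dvd
print(select_video_source(False, "satellite", ["cable", "antenna", "satellite"]))  # satellite
```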
- chroma-key as used herein is a broad term and is used in its ordinary sense and refers to without limitation a system, device, or process that is used to create an effect wherein at least one color or hue in a video image is eliminated or replaced with a different image.
- a chroma-key technique, also referred to as color separation overlay, may utilize a mixer or like device to substitute a color, such as blue or green, in one video image for portions of another video image.
- the chroma-key mixer 28 receives, as inputs, signals from the multiplexer 26 and from the video recorder 16 .
- the chroma-key mixer 28 processes these two input signals to form an output composite signal, which is communicated to the switcher 30 .
- the chroma-key mixer 28 may also receive control signals from the control circuitry 20 or from the user interface 32 .
- the chroma-key mixer 28 is used to create special visual effects that utilize the combination of two video signals to produce one composite image.
- the chroma-key mixer 28 is used to produce a composite image wherein it appears the subject from one video source, such as footage being captured by a video camera, is inserted into the footage from another video source, such as a movie on a DVD. This mixing by the chroma-key mixer 28 may be accomplished in real time.
- the chroma-key mixer 28 produces a composite image by subtracting a chroma element or elements from the real-time image, such as an image being captured by the video recorder 16 .
- the chroma element comprises at least one color that has been pre-selected or that is selected by the user, which color is used in the background for the video-recorded image.
- upon receiving the real-time signal, the chroma-key mixer 28 removes the chroma element (the background) from the video-recorded image, leaving only the image of the target subject.
- a target subject is positioned in front of a solid green screen.
- the image of the target subject is then captured by a video recorder and transmitted as a signal to the chroma-key mixer 28 .
- the chroma-key mixer 28 subtracts the chroma element (green) from the video recorder signal. This leaves only the image of the target subject along with “blank” portions where the real-time image had contained the chroma element.
- the chroma-key mixer 28 then replaces the subtracted, or blank, portions of the real-time image with portions of the image contained by the signal from the multiplexer 26 , which signal contains scenes from a movie on a DVD.
- the resulting composite image thus appears to show the target subject image, which is a real-time image, inserted into the movie scene.
- the composite image is made up of at least two video components: a foreground image, which consists of the non-chroma element portions of the video recorder signal, and a background image, which consists of the signal received from the multiplexer 26 .
- the chroma-key mixer 28 may directly substitute portions of the video source signal for the chroma-element portions of the real-time video signal.
- the chroma-element portions of the real-time video signal are made transparent by the chroma-key mixer 28 . This allows the non-chroma element portions of the real-time video signal to be layered on top of the video source signal to create the composite image.
- the example given above has been in reference to an embodiment of the invention utilizing green as the chroma element, other colors may be used. For example, blue or red may be designated as the chroma element. In addition, multiple shades of the same color may be identified as chroma elements, allowing for a finer tuning of the composite image by the chroma-key mixer 28 .
- the user is able to select portions (or colors) of the real-time image that the user wishes to remove or make transparent by designating the colors as chroma elements.
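- A minimal per-pixel sketch of the keying step described above, assuming 8-bit RGB frames held as NumPy arrays: pixels whose color falls within a tolerance of the selected chroma element are treated as transparent, so the prerecorded background shows through in their place. The tolerance value is arbitrary here and would in practice be set through the chroma controls.

```python
import numpy as np

def chroma_key_composite(foreground, background, chroma_rgb=(0, 255, 0), tolerance=80):
    """Overlay `foreground` onto `background`, treating pixels near the
    chroma color as transparent.  Both frames are HxWx3 uint8 arrays of
    the same shape; `tolerance` is a Euclidean distance in RGB space."""
    fg = foreground.astype(np.int16)
    distance = np.linalg.norm(fg - np.array(chroma_rgb, dtype=np.int16), axis=-1)
    is_chroma = distance < tolerance               # True where the green screen shows
    composite = foreground.copy()
    composite[is_chroma] = background[is_chroma]   # let the prerecorded image show through
    return composite

# Toy example: a 2x2 "frame" whose left column is pure green screen.
fg = np.array([[[0, 255, 0], [200, 50, 50]],
               [[5, 250, 5], [10, 10, 200]]], dtype=np.uint8)
bg = np.full((2, 2, 3), 128, dtype=np.uint8)
print(chroma_key_composite(fg, bg))
```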
- the chroma-key mixer 28 may perform the above-described process through various techniques.
- the chroma-key mixer 28 utilizes digital processing to create the composite image.
- the chroma-key mixer 28 may create the composite image through optical techniques or through the use of analog real-time circuits that are known in the art.
- the chroma-key mixer 28 comprises a luminance key mixer, which performs video compositing based on the brightness of portions of an image instead of color.
- the chroma-key mixer 28 outputs the composite signal to the switcher 30 .
- the switcher 30 also receives the output signal of the multiplexer 26 .
- the function of the switcher 30 is to select a single output signal from multiple input signals. For example, in one embodiment, the switcher 30 selects between the signal from the multiplexer 26 and the composite signal from the chroma-key mixer 28 .
- the switcher 30 makes its determination based on communications with the control circuitry 20 . In particular, the operation of the switcher 30 is managed by the control circuitry 20 based on the information contained in the data files, such as reference information regarding the beginning and ending reference points.
- the control circuitry 20 cross-references each frame of a prerecorded video with the beginning and ending reference points contained in the data files corresponding to the specific video being played.
- when the current frame falls between a beginning reference point and its corresponding ending reference point, the switcher 30 is automatically instructed to select the composite signal from the chroma-key mixer 28 .
- during those frames, a target image being captured by the video recorder 16 is “inserted” or superimposed in the prerecorded video scene.
- when the current frame falls outside of those reference points, the switcher 30 is automatically instructed to select the signal from the multiplexer 26 , thus removing the image of the target subject from the prerecorded video scene.
- the data files for the movie Star Wars that contain information relating to the video footage of Darth Vader® are accessed by the control circuitry 20 .
- the signal from the multiplexer 26 (which comes from the DVD player 24 ) is selected by the switcher 30 , which signal may be shown by the display 18 . Viewers of the display 18 will see the normal footage from the Star Wars movie.
- the signal from the multiplexer 26 is selected by the switcher 30 until the control circuitry 20 instructs the switcher 30 to select the composite signal from the chroma-key mixer 28 .
- This switching to the composite signal occurs when video footage of Darth Vader® is contained in the video source signal.
- the control circuitry 20 can identify the footage containing Darth Vader® by cross-referencing the relevant beginning and ending reference points from the data files.
- the beginning reference points identify the points or times in the movie when Darth Vader® enters a scene.
- the control circuitry 20 instructs the switcher 30 to select as an output the composite signal from the chroma-key mixer 28 .
- instead of Darth Vader®, viewers see in his place the image of the real-time target subject, which is being captured by the video recorder 16 .
- the ending reference points identify the points or times in a movie when Darth Vader® leaves a movie scene.
- the control circuitry 20 then instructs the switcher 30 to select as an output the signal from the multiplexer 26 .
- the target image, which is being captured by the video recorder 16 , is not shown on the display 18 .
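- Putting these pieces together, the switching logic amounts to a per-frame comparison of the DVD counter against the beginning and ending reference points. The loop below is a hedged sketch of that behavior, not an implementation from the specification; `segments` would come from the data files for the movie being played.

```python
def select_output(frame_number, segments):
    """Decide which signal the switcher should pass for this frame.

    `segments` is a list of (beginning_frame, ending_frame) reference
    points.  Inside a segment the catalogued character is on screen, so
    the composite signal (with the real-time target subject keyed in)
    is output; outside, the prerecorded signal is output unchanged."""
    for beginning, ending in segments:
        if beginning <= frame_number <= ending:
            return "composite"
    return "prerecorded"

segments = [(1200, 1850), (4020, 4410)]        # illustrative reference points
for frame in (100, 1200, 1500, 1851, 4100):
    print(frame, "->", select_output(frame, segments))
```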
- the reference information is used in one embodiment of the invention to automatically control the switching process between the signal from the video source 14 and the composite signal from the chroma-key mixer 28 .
- the reference information comprises beginning and ending reference points that correspond to the presence of a particular character in a movie or that indicate other points when it would be desirable to superimpose a real-time target image on a prerecorded image.
- the reference information of the data files may also be used to manage the audio components of the signals received from the multiplexer 26 and the video recorder 16 .
- control circuitry 20 instructs the switcher 30 to: (1) include only the audio component of the signal from the multiplexer 26 in the output signal, (2) include only the audio component of the signal from the video recorder 16 in the output signal, or (3) include both the audio components of the signals from the multiplexer 26 and the video recorder 16 in the output signal.
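- The three audio options can be sketched the same way; which mode applies at a given moment would itself be driven by the reference information. The sample representation below (plain lists of floats) is only for brevity.

```python
def mix_audio(mode, prerecorded_audio, recorder_audio):
    """Select the audio component(s) of the output signal.

    mode 1: prerecorded audio only; mode 2: real-time recorder audio
    only; mode 3: both, summed sample by sample."""
    if mode == 1:
        return list(prerecorded_audio)
    if mode == 2:
        return list(recorder_audio)
    return [p + r for p, r in zip(prerecorded_audio, recorder_audio)]

print(mix_audio(3, [0.1, 0.2, 0.3], [0.0, 0.1, -0.1]))  # both components combined
```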
- the reference information is also used to manage the display of voice prompts, such as are used in karaoke files.
- the reference information may indicate when to show voice prompts for a particular character when the character enters a scene.
- the reference information may also indicate when to remove or not display voice prompts for the particular character.
- the reference information corresponding to voice prompts may be located in the same data file as, or in a separate data file from, the reference information corresponding to video or audio components of the video source.
- although the terms “beginning reference points” and “ending reference points” are used herein to describe the functioning of the compositing system 10 , one with skill in the art will recognize that the beginning and ending reference points may be structurally and functionally equivalent.
- reference points stored in the data files are not identified as beginning or ending reference points.
- the reference points may be used by the control circuitry 20 to output a signal that causes the switcher 30 to change its state no matter what state the switcher was operating in previously.
- the depicted embodiment of the compositor device 12 comprises the user interface 32 .
- the user interface 32 comprises any interface that accepts input from a user and/or conveys information to a user.
- the user interface 32 is coupled to the chroma-key mixer 28 and to the control circuitry 20 .
- the user interface 32 may be coupled to more or fewer components of the compositor device 12 .
- the user interface may be directly coupled to the DVD player 24 to control the operation of the DVD player without the use of the control circuitry 20 .
- the user interface 32 comprises a front tray portion of the DVD player 24 , a display 40 , editing controls 42 , cropping/chroma controls 44 and a camera input display 46 .
- the user interface 32 may comprise more or fewer components.
- the user interface 32 may operate without the display 40 or without the editing controls 42 .
- the display 40 conveys to the user information regarding the operation of the compositor device 12 .
- the display 40 may depict information regarding the tracks of an inserted DVD (either by time or by frame number), the chroma color selections, the data files (such as the film title or the tracks/scenes available for substitution) and other information to assist the user.
- the display 40 is a light emitting diode (LED) display.
- the display 40 is a liquid crystal display (LCD).
- the display 40 may convey information through the use of graphics, characters, or both.
- the cropping/chroma controls 44 allow a user to modify in real time the video image being captured by the video recorder 16 so that the image conforms to the prerecorded video scene in which the image is inserted.
- the cropping/chroma controls 44 allow the user to select the chroma element or elements to be subtracted from the captured video image. Such selection may be made by selecting the name of a particular color or by selecting a visual representation of the color that is shown on the user interface display 40 or the external display 18 .
- the cropping/chroma controls 44 also allow the user to crop the captured video image such that it “fits” in the prerecorded background image. These controls may be used to zoom out or zoom in on a target subject in order to adjust the size of the target subject to be in proportion with other objects in the prerecorded scene on to which the target subject is superimposed.
- the user interface 32 may comprise a color saturation control that adjusts the color level of the captured video. This allows a color image to be adjusted so as to blend into a black-and-white background.
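- The cropping and color-saturation adjustments can be illustrated with the same array conventions as the keying sketch above; the crop rectangle and saturation factor stand in for whatever values the panel controls would supply.

```python
import numpy as np

def crop_frame(frame, top, left, height, width):
    """Keep only the region of the captured frame that should sit
    inside the prerecorded scene."""
    return frame[top:top + height, left:left + width]

def adjust_saturation(frame, factor):
    """Scale color saturation; a factor of 0.0 yields grayscale, which
    helps blend a color subject into a black-and-white background."""
    gray = frame.mean(axis=-1, keepdims=True)            # simple luma approximation
    blended = gray + factor * (frame.astype(np.float32) - gray)
    return np.clip(blended, 0, 255).astype(np.uint8)
```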
- the camera input display 46 identifies the video recorders 16 that are connected to the compositor device 12 and that are available to capture video for processing. For example, if multiple video recorders 16 were coupled to the compositor device 12 , then multiple lights of the camera input display 46 may be illuminated. In other embodiments of the invention, the camera input display 46 identifies when the video being captured by one of the video recorders 16 is being processed and output to the display 18 . In other embodiments of the invention, the user interface 32 does not include the camera input display 46 .
- the user interface 32 also comprises controls that are generally found on CD/DVD players.
- the user interface comprises a power button 48 to turn the machine on and off.
- the user interface also comprises DVD/CD controls 50 , such as play, rewind, fast forward, stop, pause, eject and the like, that are used to control the operation of the DVD player 24 .
- the user interface 32 also includes a remote control input (not shown).
- the remote control input may accept instructions or data that are transmitted to the user interface 32 from one or more remote control devices. These instructions may correspond to controls that are present on the user interface 32 or may include more or fewer instructions that enable the user to manage the operation of the compositor device 12 .
- FIG. 3 depicts one implementation of the user interface 32 .
- the user interface 32 may comprise a touch screen that both displays information to a user and accepts input from the user.
- the user interface 32 may accept instructions through voice recognition or may be coupled to another system or device, such as a keyboard or personal computer, that accepts input from a user.
- the compositor device 12 operates without a user interface 32 .
- a user interface may be incorporated into the display 18 .
- display 18 may show on-screen instructions to a user or accept commands inputted by the user.
- the control circuitry 20 manages the content of the signal outputted to the display 18 .
- the output of the switcher 30 is coupled to the control circuitry 20 .
- the control circuitry 20 may select to output the signal from the switcher 30 or may output other video content that is stored in the memory 22 .
- prerecorded scenes are stored in the memory 22 , and upon input from the user, the control circuitry 20 outputs these scenes instead of the signal from the switcher 30 .
- the control circuitry 20 is used to overlay information such as subtitles or dialogue prompts over the signal from the switcher 30 .
- the output of the switcher 30 is coupled to the display 18 without intervention by the control circuitry 20 .
- the compositor device 12 includes input/output ports (not shown) that interface with external systems or devices.
- the compositor device 12 may comprise a serial port, a universal serial bus (USB), a firewire port, or the like that can be used to download data files from a personal computer or handheld device.
- USB universal serial bus
- FIG. 4 is a block diagram of an embodiment of the invention wherein the compositor device 12 is structured to receive input from multiple video recorders.
- the compositor device 12 includes the same components as the embodiment of the compositor device 12 depicted in FIG. 2 .
- the compositor device 12 of FIG. 4 additionally comprises a second chroma-key mixer 54 and a second switcher 56 .
- the second chroma-key mixer 54 functions similarly to the first chroma-key mixer 28 .
- the second chroma-key mixer 54 receives, as inputs, signals from the multiplexer 26 , the first chroma-key mixer 28 and a second video recorder 16 .
- the second chroma-key mixer 54 may also receive instructions from the control circuitry 20 .
- the second chroma-key mixer 54 removes the chroma element from a real-time image, such as captured by the second video recorder, and combines the modified real-time image with another signal to form a composite signal.
- the second chroma-key mixer 54 may combine the real-time image captured by the second video recorder with either the prerecorded video from the multiplexer 26 or the composite signal outputted by the first chroma-key mixer 28 .
- the second chroma-key mixer 54 then outputs a second composite signal to the second switcher 56 .
- the second chroma-key mixer 54 may be external to the compositor device 12 . In yet other embodiments, portions of the second chroma-key mixer 54 may be external to the compositor device 12 and portions of the second chroma-key mixer 54 may be internal to the compositor device 12 .
- the second switcher 56 functions similarly to the first switcher 30 .
- the second switcher 56 receives, as inputs, signals from the first switcher 30 and from the second chroma-key mixer 54 .
- the second switcher 56 selects between these inputs based upon instructions received from the control circuitry 20 .
- the second switcher 56 may output a signal to the control circuitry 20 or to the display 18 .
- the content of the output signal of the second switcher 56 may be: (1) the prerecorded signal from the multiplexer 26 , (2) the composite signal from the first chroma-key mixer 28 having portions of an image from the first video recorder, (3) the second composite signal from the second chroma-key mixer 54 having portions of an image from the second video recorder, or (4) the second composite signal from the second chroma-key mixer 54 having portions of images from the first video recorder and from the second video recorder.
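- Functionally, the cascade reduces to two layering decisions, one per camera, which yields the four output cases listed above. The sketch below reuses the chroma_key_composite helper from the earlier keying example (or any equivalent keyer) and simplifies the signal routing for clarity.

```python
def two_camera_output(prerecorded, camera1, camera2, use_camera1, use_camera2):
    """Reproduce the four output cases of the cascaded mixers and
    switchers: neither camera, camera 1 only, camera 2 only, or both
    cameras layered over the prerecorded frame."""
    stage1 = chroma_key_composite(camera1, prerecorded) if use_camera1 else prerecorded
    stage2 = chroma_key_composite(camera2, stage1) if use_camera2 else stage1
    return stage2
```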
- FIG. 5 illustrates one embodiment of an interactive video compositing process 100 .
- the compositing process 100 begins with State 105 .
- the user selects a video source that the user wants to use as a background image for a final composite image. For example, the user may insert a favorite movie into the DVD player 24 .
- the compositing process 100 continues with State 110 .
- data files associated with the video source are accessed by the compositor device 12 .
- the serial number of the DVD is communicated to the control circuitry 20 , which uses the serial number to identify and access the appropriate prerecorded data files.
- the data files identify scenes recorded on the DVD that have been catalogued as being suitable for chroma key substitution.
- the suitable scenes are identified by the data files stored in the memory 22 of the compositor device 12 .
- These data files may have been preprogrammed in the memory 22 , may have been downloaded or saved from another source, or may be saved on the selected DVD, such as in the bonus materials section of the DVD.
- the scenes catalogued as being available for chroma-key substitution generally contain video footage of a character for which a user can substitute his or her own image.
- the data files comprise reference information that identifies which scenes of the DVD contain video footage of the particular character.
- the available scenes are communicated to the user, such as through the display 40 of the user interface 32 .
- the available scenes may be communicated through an external display, such as display 18 .
- the user selects an available scene into which the user wants to superimpose or “insert” a real-time image.
- the user makes the selection through the user interface 32 , such as through a remote control.
- the compositing process 100 proceeds with State 120 .
- the video recorder 16 is used to capture a target image.
- This target image is the image that is overlaid onto, or inserted into, the scenes from the video source in real time.
- the target subject is preferably positioned in front of an evenly lit, solid-colored background, the color of which represents the chroma element.
- for example, if the selected chroma element is green, the target subject is positioned in front of a “green screen.”
- Other colors or types of backgrounds may be used that enable the background to be later “removed” when forming a composite image with scenes from the video source.
- a video camera is utilized to capture the target image and convert the image into a signal that is input into the chroma-key mixer 28 of the compositor device 12 . It should be recognized that, along with a video image, an audio signal may also be recorded by the video camera and input into the compositor device 12 .
- the chroma-key mixer 28 of the compositor device 12 creates a composite image through the processes that have been previously discussed.
- the chroma-key mixer 28 removes the chroma element from the image captured by the video camera, leaving only the video image of the target subject. Then, corresponding portions of the image from the video source are used to fill in the removed portions of the video-recorded image, thus forming a composite image that appears to have the target image inserted into scenes from the video source.
- the chroma element portions of the video-captured image are made transparent, such as through digital processing, leaving only the target image.
- the modified video-captured image is then layered on top of the video source image.
- portions from the background video source image that are located underneath the top layer target image are not seen on the display of the composite image.
- portions of the background video source image located underneath the transparent portions of the top layer are viewable.
- the compositing process 100 then moves to State 125 .
- this selection of the output image is made by the switcher 30 .
- the control of this selection may be performed automatically (without user interaction) by the compositor device 12 using information from the prerecorded data files, or the user may control the operation of the switcher 30 through the user interface 32 .
- the control circuitry 20 , based upon the beginning and ending reference points contained in the data files, may instruct the switcher 30 when to output the video source image and when to output the composite image that has the target image overlaid on the video source image.
- the user may also manually control when such switching occurs. Such manual control allows the user more freedom to create desired scenes or special effects.
- if the switcher 30 is instructed to select the composite image, the compositing process 100 moves to State 130 .
- the composite image is shown on the display 18 . Viewers of the display 18 will observe the real-time target image inserted into the prerecorded footage from the video source. For example, viewers may see the target image replacing a character in a movie playing on the DVD player 24 .
- the compositing process 100 moves to State 140 .
- if the switcher 30 is instructed to select the video source image, the compositing process 100 moves to State 135 .
- the image from the video source is shown on the display 18 .
- the compositing process 100 then proceeds to State 140 .
- preprogrammed data may include, for example, prerecorded scenes that are stored in the memory 22 .
- the prerecorded scenes comprise video clips that users may want to insert in order to make the displayed scenes appear more interactive or to appear more life-like.
- prerecorded video clips having various forms of feedback from judges could be inserted after a target subject has acted out a scene (which was observed by the viewers of the display 18 ).
- different video clips are selected to be displayed based on input given by the viewers.
- if preprogrammed data is to be displayed, the compositing process 100 moves to State 145 .
- the preprogrammed data is communicated to the display 18 .
- the control circuitry 20 manages which signal is communicated to the display 18 .
- a multiplexer or other similar device may be used to select which signal is output to the display 18 .
- the length of time that the preprogrammed data is displayed may be directly controlled by the user or may be a set length of time, such as, for example, until the end of the particular video clip.
- upon completion of State 145 , the compositing process 100 returns to State 140 . If preprogrammed data is not to be displayed, the compositing process 100 returns to State 125 to determine whether the video source image or the composite image is to be displayed.
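- The flow of States 125 through 145 reduces to a small per-frame loop. The sketch below is a schematic rendering of those decision points, reusing the select_output helper from the switching example above; the preprogrammed-clip trigger is a placeholder for viewer or user input.

```python
def run_compositing_loop(frames, segments, clip_requested):
    """Schematic of States 125-145: decide per frame what to display,
    then check whether a preprogrammed clip should be shown."""
    shown = []
    for frame in frames:
        shown.append((frame, select_output(frame, segments)))  # States 125/130/135
        if clip_requested(frame):                               # State 140
            shown.append((frame, "preprogrammed clip"))         # State 145
    return shown

# Illustrative run: a stored clip is requested right after frame 1850.
history = run_compositing_loop(range(1845, 1855), [(1200, 1850)],
                               lambda f: f == 1851)
print(history)
```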
- the compositing process 100 illustrated in FIG. 5 is only one example of the functioning of the compositing system 10 .
- the compositing process 100 may include more or fewer states, or states may be combined or executed in a different order.
- additional states may be added that illustrate the separate control of audio signals and video signals.
- preprogrammed data may be displayed at the beginning of the compositing process 100 or upon the selection of a particular video source for playback.
- the compositing system 10 may be particularly useful with preprogrammed video that is easily adapted to allow for user interaction.
- the user selects a DVD that has recorded scenes from the television talent show American Idol. Data files corresponding to video segments of the DVD are stored in the memory 22 of the compositor device 12 .
- the user is provided with options of scenes that are available for user interaction. For example, the user may have the option to select different scenes in which the user may “perform” in front of the judges or an audience.
- the user selects a scene for video compositing. For example, the user may pick a scene in which a contestant is performing by singing a song in front of the judges. A user whose image is going to be substituted into the American Idol footage is positioned in front of the video recorder 16 with a green screen serving as the background. The individual then performs as if he or she were actually participating on the American Idol program, the performance of the individual being captured by the video recorder 16 and converted to a signal communicated to the compositor device 12 .
- the display 18 shows video and audio from the American Idol program.
- the image of the individual being recorded by the video recorder 16 is substituted in real time for the participant.
- the timing of the substitution of images is determined by the reference information recorded in the data files, which process has been described previously.
- the user may also manually control which images are output to the display 18 .
- Audio signals that are captured by the video recorder 16 are also output through the display 18 .
- the substitution of the real-time audio signals from the video recorded footage may occur at appropriate points in the American Idol scenes, such as when the participant is performing or singing.
- the audio substitution need not occur at the same times as the video substitution. For example, there may exist portions of American Idol footage that contain the voice of the participant but that do not contain the video image of the participant.
- the substitution of the audio signals may be automatically controlled by the compositor device 12 based on the data file information and/or may be manually controlled by the user.
- After the performance, viewers have the option to rate the performance of the individual who has been inserted into the program. These viewer ratings may be used to select the display of prerecorded video clips having feedback from the judges on American Idol. For example, prerecorded video clips of good reviews, bad reviews, and average reviews may be stored in the memory 22 of the compositor device 12 . The viewers then have the option of inputting their opinions of the performance, such as through remote controls communicating with the user interface 32 . If the viewers rate the performance by the individual as being generally poor, then the compositor device 12 selects the playback of video clips that include the judges being critical of the performance. On the other hand, if the viewers rate the performance as being generally good, then the compositor device 12 selects the playback of video clips that give positive feedback from the judges.
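- A sketch of the viewer-rating logic in this example: viewers' scores are averaged and mapped onto one of the stored feedback clips. The rating scale, thresholds, and clip names are invented for illustration.

```python
def choose_feedback_clip(ratings):
    """Map viewer ratings (a 1-5 scale is assumed) to a stored judges' clip."""
    average = sum(ratings) / len(ratings)
    if average >= 4.0:
        return "judges_positive_clip"
    if average >= 2.5:
        return "judges_average_clip"
    return "judges_critical_clip"

print(choose_feedback_clip([5, 4, 4]))  # positive feedback is played back
print(choose_feedback_clip([2, 1, 2]))  # critical feedback is played back
```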
- the above-described American Idol program may be stored entirely in the memory 22 of the compositor device 12 without the use of a DVD.
- the compositor device 12 may include a monitor that displays the appropriate voice prompts to the user, which prompts correspond to portions of the American Idol program.
- the compositor device 12 may include a CD+G input or player that reads files and outputs a display having voice prompts.
- the compositor device may accept video or audio input from a video game system. This would allow a user to “insert” himself or herself into the video game and to interact with objects therein.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Television Signal Processing For Recording (AREA)
- Studio Circuits (AREA)
Abstract
An interactive video compositing device includes a chroma-key mixer, video switcher and control circuitry. The chroma-key mixer generates a composite image by combining a real-time image, such as one captured by a video recorder, with a prerecorded video image, such as a movie. The composite image includes the modified real-time image superimposed, or overlaid, onto the prerecorded image. The video switcher automatically selects either the composite image or the prerecorded image to be output to a display. The control circuitry controls the video switcher and other outputted signals based on data file information that corresponds to content of the prerecorded image or media. For example, the data files may contain information relating to the presence (or absence) of a particular character in a movie scene, thus allowing for the output and display, at appropriate times, of the real-time composite image instead of the prerecorded image.
Description
- This application is a continuation of, and claims priority benefit under 35 U.S.C. §120 from, U.S. patent application Ser. No. 10/836,729, filed Apr. 30, 2004, which claims the benefit of priority under 35 U.S.C. §119(e) of U.S. Provisional Application No. 60/467,480, filed on May 2, 2003, and entitled “INTERACTIVE SYSTEM FOR REAL-TIME CHROMA KEY COMPOSITING,” the entirety of each of which is incorporated herein by reference and is to be considered part of this specification.
- 1. Field
- Preferred embodiments of the invention relate to a system and method for signal compositing, and more particularly, to a system and method for interactive video compositing.
- 2. Description of the Related Art
- Interactive entertainment is a popular leisure activity for people across the globe. One favorite activity for many is karaoke, which temporarily turns lay persons into “stars” as they sing the lyrics to a favorite song. Karaoke machines play the music of a selected song while simultaneously displaying the song lyrics to a user, thus allowing the user to sing along with the background music.
- Another favorite leisure activity for millions is watching movies. Billions of dollars are spent each year on ticket sales for new box office hits. In addition, an increasing number of people are beginning to view movies at home. One popular technology that has been recently adopted is the digital video disk (DVD) technology. Movies are recorded in digital format onto a disk that allows for repeated viewings at the convenience of the user. The sound and picture quality of DVDs, along with the convenient features that accompany digital media, have made DVDs a popular replacement to standard video cassette technology.
- However, movie watching has predominantly been a passive activity. That is, one either travels to a theater or inserts a disk into a DVD player and sits back to watch the movie. Though one may watch the same movie repeatedly, each time the same characters appear and recite the same lines and perform the same actions. As of yet, one has not been able to act out his or her favorite scene while appearing to be in the movie itself in a real-time manner.
- Accordingly, there is a need for an interactive system that allows for video images to be inserted into a prerecorded video, such as a movie. In one embodiment of the present invention, chroma-key technology is used to produce a composite video image, wherein it appears that a target image, such as one captured by a video camera, is superimposed in real time onto a prerecorded image. The compositing process is managed by internal circuitry and data files that direct when and how the real-time image is inserted into the prerecorded image.
- In one embodiment, an interactive video compositing device includes a mixer, video switcher and control circuitry. The mixer utilizes chroma-key technology to remove a chroma element from a real-time video image. The mixer combines the modified image with a prerecorded video image to form a composite image, which includes the modified real-time image overlaid on the prerecorded image. The video switcher automatically selects either the composite image or the prerecorded image to be output to a display. This selection is managed by the control circuitry and is based on data file information that corresponds to the content of scenes of the prerecorded image. For example, the data files may contain information relating to the presence (or absence) of a particular character in the movie scene, thus allowing for the output of the real-time composite image instead of the prerecorded image at appropriate times.
- In another embodiment of the invention, a system for real-time, chroma-key compositing comprises a first input structured to receive a first video signal; a second input structured to receive a second video signal; a chroma-key mixer coupled to the first input and to the second input, wherein the chroma-key mixer is structured to output a composite signal comprising portions of the first video signal and portions of the second video signal; a data file associated with the first video signal, wherein the data file has at least first reference data and second reference data; and a switcher that selects between outputting the composite signal and outputting the first video signal, wherein the switcher outputs the composite signal at a time identified by the first reference data, and wherein the switcher outputs the first video signal at a time identified by the second reference data.
- In another embodiment, a reference data system used for video compositing comprises first reference information associated with a first video signal; second reference information associated with the first video signal; and a video switcher configured to switch between outputting either the first video signal or a composite signal that includes portions of the first video signal and portions of a second video signal, wherein the video switcher outputs the composite signal at a time identified by the first reference information, and wherein the video switcher outputs the first video signal at a time identified by the second reference information.
- In another embodiment of the invention, a software program comprises code for controlling the superimposing of a real-time image onto a prerecorded image, wherein said code identifies portions of the prerecorded image during which the real-time image is to be superimposed onto the prerecorded image.
- In another embodiment, a system for real-time video compositing comprises a means for generating a composite signal by chroma-key mixing a first video signal and a real-time second video signal; and a means for switching between the selection of the first video signal and the selection of the composite signal, wherein the means for switching is controlled according to reference information stored in at least one data file that relates to the first video signal.
- In another embodiment, a method for video compositing comprises receiving a first video signal having multiple frames; receiving a second video signal; creating a composite signal by chroma-key mixing the first video signal and the second video signal; and selectively switching, with a video switcher, between outputting the first video signal and outputting the composite signal, wherein said selective switching is performed automatically and is based on a control signal received by the video switcher, wherein the control signal is derived at least in part from prerecorded data that corresponds to frames of the first video signal.
- In another embodiment, a method for coordinating the display of multiple signals comprises receiving a first signal; receiving a second signal; forming a composite signal comprising portions of the first signal and portions of the second signal; receiving with a switcher the first signal and the composite signal; receiving reference data that corresponds to portions of the first signal; and switching between displaying the first signal and displaying the composite signal, wherein the timing of said switching is based on information contained in the reference data.
- In another embodiment, a method for creating a data file used for controlling the superimposing of video images comprises identifying a portion of a first video image during which a second video image is to be superimposed onto the first video image, said first video image portion having a beginning frame and an ending frame; recording in a data file first information that identifies said beginning frame; and recording in the data file second information that identifies said ending frame.
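- By way of a purely illustrative sketch (not part of the original disclosure), the data-file creation method described above can be pictured as recording a beginning-frame/ending-frame pair for each portion of the first video image selected for compositing. The JSON container, field names and helper function below are assumptions made only to make the idea concrete.

```python
import json

def record_scene(data_file_path, title, serial_number, beginning_frame, ending_frame):
    """Append one compositing entry (beginning/ending frame pair) to a data file.

    The schema (title, serial number, scene list) is hypothetical; the method only
    requires that first information identifying the beginning frame and second
    information identifying the ending frame be recorded in the data file.
    """
    try:
        with open(data_file_path) as f:
            data = json.load(f)
    except FileNotFoundError:
        data = {"title": title, "serial_number": serial_number, "scenes": []}

    data["scenes"].append({
        "beginning_frame": beginning_frame,  # first information: segment begins here
        "ending_frame": ending_frame,        # second information: segment ends here
    })

    with open(data_file_path, "w") as f:
        json.dump(data, f, indent=2)

# Example usage with made-up frame numbers:
# record_scene("star_wars.json", "Star Wars", "SW-0001", 14230, 15980)
```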
- For purposes of summarizing the invention, certain aspects, advantages and novel features of the invention have been described herein. It is to be understood that not necessarily all such advantages may be achieved in accordance with any particular embodiment of the invention. Thus, the invention may be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.
- FIG. 1 illustrates a real-time video compositing system according to one embodiment of the invention.
- FIG. 2 is a block diagram of one embodiment of a compositor device.
- FIG. 3 illustrates a user interface of one embodiment of a compositor device.
- FIG. 4 is a block diagram of an embodiment of a compositor device that is structured to receive input from two video recorders.
- FIG. 5 is a flow chart depicting one embodiment of an interactive video compositing process.
- The features of the system and method will now be described with reference to the drawings summarized above. Throughout the drawings, reference numbers are re-used to indicate correspondence between referenced elements. The drawings, associated descriptions, and specific implementation are provided to illustrate embodiments of the invention and not to limit the scope of the invention.
- In addition, methods and functions described herein are not limited to any particular sequence, and the steps or states relating thereto can be performed in other sequences that are appropriate. For example, described steps or states may be performed in an order other than that specifically disclosed, or multiple steps or states may be combined in a single step or state.
- FIG. 1 illustrates one embodiment of a real-time video compositing system 10. The compositing system 10 is used to selectively superimpose images. These images may comprise images that can be viewed in real time, prerecorded images, or a combination thereof. For example, the compositing system 10 may superimpose a real-time image onto an image that has been prerecorded, such as a movie. The compositing system 10 generally comprises a compositor device 12, a video source 14, a video recorder 16, and a display 18. In the depicted embodiment, the compositor device 12 receives input signals from the video source 14 and the video recorder 16 and outputs a signal to the display 18.
- The term "compositing" as used herein is a broad term and is used in its ordinary sense and includes without limitation the superimposing or combining of multiple signals, such as, for example, video and/or audio signals, to form a composite signal. The term "compositor" refers to any device or system, implemented in hardware, software, or firmware, or any combination thereof, that performs in whole or in part a compositing function.
- The term “real time” as used herein is a broad term and is used in its ordinary sense and includes without limitation a state or period of time during which some event or response takes place. A real-time system or application produces a response to a particular stimulus within a certain response time. For example, a device processing data in real time may process the data as it is received by the device. A real-time signal is one that is capable of being displayed, played back, or processed within a particular time after being received or captured by a particular device or system. In one embodiment, this particular time is on the order of one millisecond. In other embodiments, the particular time may be longer than one millisecond. For example, the particular time may be on the order of hundreds of milliseconds. In other embodiments of the invention, the particular time may be less than one millisecond. For example, the particular time may be on the order of microseconds. In yet other embodiments of the invention, “real time” refers to events simulated at a speed similar to the speed at which the events would occur in real life.
- The video source 14 includes any device, system or technology used to generate, receive, capture, read, supply or store video data. In another embodiment, the video source 14 may generate an audiovisual signal that includes a video portion that can be processed to produce a video signal (e.g., to produce a visual image) and an audio portion that can be processed to produce an audio signal (e.g., sound at a level high enough to be heard by a human ear). For example, in one embodiment of the invention, the video source 14 comprises a digital video disk (DVD) player. In another embodiment, the video source 14 comprises a memory that stores data representing video content. In yet other embodiments of the invention, the video source 14 may comprise a device that receives a video transmission, such as through a cable network, a satellite dish, an antenna, or a network. For example, the video source 14 may comprise a television, a video cassette recorder (VCR), a CD+G player or a digital video recorder. - In other embodiments of the invention, the
compositing system 10 may includemultiple video sources 14, each being coupled to thecompositor device 12. In such an embodiment, thecompositing system 10 may comprise a multiplexer or a switch that selects a signal from one ofmultiple video sources 14. For example, in one embodiment, thecompositing system 10 comprises a DVD player that reads data from a DVD and a cable box that receives a video transmission over a coaxial cable line. A two-input multiplexer may then be used to select between a signal from the DVD player and a signal from the cable box. - The
video source 14 may be coupled to the compositor device 12 by any medium that provides for video signal transmission. For example, the video source 14 may be coupled to the compositor device 12 through an RCA cable, an S-cable, a coaxial cable, Ethernet, wireless technologies and the like. One with skill in the art will also recognize that the video source 14 may supply audio content along with video content. This audio content may be delivered on the same or different mediums as the video content.
- FIG. 1 depicts an embodiment of the invention wherein the video source 14 is external to the compositor device 12. In other embodiments of the invention, the video source 14 may be internal to the compositor device 12. For example, the compositor device 12 may comprise a DVD player or may comprise a memory having stored video data. In yet other embodiments, the compositing system 10 may comprise at least one video source 14 that is internal and at least one video source 14 that is external to the compositor device 12. - With continued reference to
FIG. 1 , thecompositing system 10 includes thevideo recorder 16. Thevideo recorder 16 comprises any device, system or technology that is capable of converting real-time video images into an electronic signal, such as a digital or an analog signal. In one embodiment, thevideo recorder 16 is capable of the real-time conversion of and transmission of video images. In one embodiment, thevideo recorder 16 is a video camera. For example, thevideo recorder 16 may comprise a camcorder, such as an analog camcorder or a digital camcorder. Thevideo recorder 16 may be coupled to thecompositor device 12 through an RCA cable, an S-cable, a coaxial cable, Ethernet, wireless technologies and the like. - In other embodiments of the invention, the
compositing system 10 may include multiple video recorders 16, each being coupled to the compositor device 12. For example, multiple video cameras may be coupled to the compositor device 12. In addition, the video recorder 16 may be internal or external to the compositor device 12. - The
compositing system 10 also comprises thedisplay 18. Thedisplay 18 receives an output signal from thecompositor device 12 and converts the output signal to at least a video image. For example, in one embodiment of the invention, thedisplay 18 comprises a television that is coupled to thecompositor device 12 through RCA cables. In other embodiments, thedisplay 18 may include a video projector, a monitor or the like and may be coupled to thecompositor device 12 through any medium that provides for video signal transmission, as has been previously described. - In other embodiments of the invention, the
display 18 may also be used to provide instructions or data to the user or users of thecompositing system 10. For example, menu selections or command prompts may be displayed to the user through thedisplay 18. In addition, dialogue prompts, such as are used in general karaoke machines may be portrayed on thedisplay 18 so as to assist a user in reciting the appropriate lines. - In other embodiments of the invention, the
compositing system 10 may comprisemultiple displays 18. Thedisplay 18 may also be internal or external to thecompositor device 12. For example, thecompositor device 12 may include a screen that portrays a video image to the user. Such a screen would allow a user to have visual feedback as to the final output of thecompositing system 10 without having to look at an external display. - In one embodiment, the
compositing system 10 may also comprise a media storage device (not shown) that stores the signal output by thecompositor device 12. For example, thecompositor system 10 may comprise a memory configured to store in digital form a copy of the output signal that is sent to thedisplay 18. In another embodiment, thecompositing system 10 may output a signal only to the media storage device instead of thedisplay 18. In such an embodiment, the output video and audio content could be stored for later playback on another device. In yet other embodiments, the media storage device may be included with thedisplay 18. -
FIG. 2 illustrates a block diagram of one embodiment of thecompositor device 12. Thecompositor device 12 allows a user to selectively overlay images in real time onto a second video image, such as prerecorded video content. In one embodiment, thecompositor device 12 comprisescontrol circuitry 20, amemory 22, aDVD player 24, amultiplexer 26, a chroma-key mixer 28, aswitcher 30 and auser interface 32. - In one embodiment, the components of the
compositor device 12 are modules that comprise logic embodied in hardware or firmware, or that comprise collection of software instructions written in a programming language, such as, for example C++. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpretive language such as BASIC. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software instructions may be embedded in firmware, such as an EPROM or EEPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. For example, in one embodiment, the functions of thecompositor device 12 may be implemented in whole or in part by a personal computer or other like device. - It is also contemplated that the components of the
compositor device 12 need not be integrated into a single box. The components can be separated into several subcomponents or can be separated into different devices that reside at different locations and that communicate with each other, such as through a wired or wireless network, or the Internet. Multiple components may be combined into a single component. It is also contemplated that the components described herein may be integrated into a fewer number of modules. One module may also be separated into multiple modules. - The
control circuitry 20 manages the operation of components of thecompositor device 12. In one embodiment, thecontrol circuitry 20 is a special purpose microprocessor. In other embodiments, thecontrol circuitry 20 may be implemented as an application-specific integrated circuit (ASIC). In yet other embodiments, thecontrol circuitry 20 may be implemented as one or more modules, which modules may be configured to execute on one or more processors. The modules may comprise, but are not limited to, any of the following: hardware or software components such as software object-oriented software components, class components and task components, processes, methods, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, applications, algorithms, techniques, programs, circuitry, data, databases, data structures, tables, arrays, variables, or the like. - The
control circuitry 20 communicates with thememory 22. Thememory 22 may comprise any buffer, computing device, or system capable of storing computer instructions and data for access by another computing device or a computer processor, such as, for example, thecontrol circuitry 20. In one embodiment, thememory 22 comprises random access memory (RAM). In other embodiments, thememory 22 may comprise other integrated and accessible memory devices, such as, for example, read-only memory (ROM), programmable ROM (PROM), and electrically erasable programmable ROM (EEPROM). In another embodiment, thememory 22 comprises a removable memory, such as a floppy disk, a compact disk (CD), a ZIP® disk, a DVD, a removable drive or the like. - In one embodiment, the
memory 22 stores data files that include data or information regarding the content of particular media. In one embodiment, a data file or a group of related data files contains information specific to a particular movie or prerecorded video footage. For example, data files for a particular movie may be referenced by using the movie's unique serial number. The information contained in the data files may identify scenes or segments of the particular movie or video that have been catalogued as being suitable for video compositing. The data files may also contain content relating to dialogue prompts for particular characters, menu options, and other data relating to scenes available for video compositing. - In one embodiment, the data files contain reference information that identifies particular frames or points in a video source that may be later used in the compositing process. For example, in one embodiment the reference information is used by the
composting device 12 to identify the frames of a movie scene in which a particular character is present. In one embodiment, the reference information may be used to trigger multiple signals for use in the compositing process. For example, the reference information may contain both beginning and ending reference points, wherein the beginning reference point indicates the commencement of a particular feature in the video source, such as the entrance of a character into a scene, and wherein the ending reference point identifies the ending of a particular feature, such as when the character leaves the scene. - The term “reference information” as used herein is a broad term and is used in its ordinary sense and includes without limitation any type, or combination of types, of data that stores or contains information regarding particular media. For example, as described above, reference information may comprise reference points that identify scenes containing particular characters. However, reference information is not limited to such reference points. In other embodiments of the invention, reference information may comprise code, symbols, alphanumeric information, or the like that represent a song, a particular event, or a particular image that is contained or represented in particular media, such as an audiovisual signal.
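- As a purely illustrative sketch of how such reference information might be organized (the dictionary keyed by a disc serial number and the helper function below are assumptions made for illustration, not the format actually used by the compositor device 12), the beginning and ending reference points can be held as frame ranges and queried for a given frame:

```python
# Hypothetical reference data for one movie, keyed by the disc's serial number.
# Each (beginning, ending) pair marks frames during which the substitutable
# character is present, mirroring the beginning and ending reference points.
REFERENCE_DATA = {
    "SW-0001": {
        "title": "Star Wars",
        "segments": [
            (14230, 15980),   # character enters at frame 14230, leaves at 15980
            (32450, 33120),
        ],
    },
}

def character_present(serial_number, frame):
    """Return True if the given frame lies inside any catalogued segment."""
    entry = REFERENCE_DATA.get(serial_number)
    if entry is None:
        return False
    return any(begin <= frame <= end for begin, end in entry["segments"])

# e.g. character_present("SW-0001", 15000) -> True
```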
- In one embodiment, data files may be stored in the
memory 22. For example, data files stored in thememory 22 may contain information relating to the individual frames of a particular movie, such as Star Wars. In one embodiment, the preprogrammed data files associated with Star Wars would identify scenes in the movie that have been selected as being suitable as a background scene for video compositing. The data files may identify when a particular character, such as Darth Vader®, is present in a specific scene. This data file information allows for the user to chose between pre-selected scenes and facilitates the generation of a signal by thecontrol circuitry 20 to be used in coordinating and creating a composite video signal. The data files may be preprogrammed in thememory 22. In other embodiments, the data files may be later saved in thememory 22 by the user. - In other embodiments, the data files need not be stored in the
memory 22. The data files may be generated in real time, generated on the fly, derived or received from an external source or device, or generated by thecompositor device 12 itself. For example, data files for particular media could be downloaded from the Internet, transferred from a removable storage medium, or even programmed by the user. The data files may also be embedded in a closed-caption signal or be saved on a DVD, such as in a bonus material section, that contains the corresponding movie. - Although the
memory 22 is depicted as being external to thecontrol circuitry 20, in other embodiments of the invention, thememory 22 may be internal to thecontrol circuitry 20. For example, thememory 22 may exist as a cache in thecontrol circuitry 20. Thememory 22 may also be external to thecompositor device 12. For example, in one embodiment, thememory 22 comprises an external hard drive. Thememory 22 may also comprise multiple memory devices for storing data or information. - The
DVD player 24 is one embodiment of a video source for thecompositor device 12. TheDVD player 24 functions as a general purpose DVD player and outputs video and audio content stored on a DVD to themultiplexer 26. TheDVD player 24 also includes a counter that adjusts itself based on which frame of the DVD is being read. For example, the DVD counter may correlate each frame of the DVD with a specific time code relating to the media stored on the DVD. This counter enables theDVD player 24 to jump to, identify or read specific frames of video content stored on DVDs. TheDVD player 24 also reads DVD serial numbers so as to identify the media content contained by the particular DVD and may communicate the serial number to thecontrol circuitry 20. - The functioning of the
DVD player 24 may be controlled by thecontrol circuitry 20, which loads appropriate data files from thememory 22. For example, when a Star Wars DVD is placed in theDVD player 24, the DVD player reads the serial number of the DVD and communicates the number to thecontrol circuitry 20. Thecontrol circuitry 20 uses the serial number to find the appropriate data files stored in thememory 22. The data files identify the media content of the DVD as being the Star Wars movie and also identify which scenes, or frames, are to be played by theDVD player 24. In this way, thecontrol circuitry 20 is able to manage the functioning of theDVD player 24. - In one embodiment of the invention, the
DVD player 24 also reads DVDs that contain data as opposed to a movie or a video. For example, in one embodiment, theDVD player 24 is used to read DVDs that contain data files that are associated with several movies or videos, which data files may be copied to thememory 22. - With continued reference to
FIG. 2 , other video sources, such as external video sources, may communicate with thecompositor device 12. In one embodiment, themultiplexer 26 is configured to accept signals from multiple external sources as well as from theDVD player 24. For example, themultiplexer 26 is shown as being structured to receive signals from theDVD player 24, a cable network, an antenna and a satellite. In other embodiments, themultiplexer 26 may be configured to receive fewer or more signals. For example,multiplexer 26 may be configured to receive a streaming video over the Internet or data from a cable box. Themultiplexer 26 may also be configured to receive auxiliary signals from an external DVD player or a VCR. - The
multiplexer 26 is structured to select one of multiple input signals, the selection being based on a control signal. In one embodiment, thecontrol circuitry 20 supplies the control signal to themultiplexer 26. In another embodiment, themultiplexer 26 automatically selects the signal from theDVD player 24 when a DVD is inserted therein and selects the signal from the other available signals when no DVD is present in theDVD player 24. For example, in one embodiment, the user may input a selection through theuser interface 32. - As shown in
FIG. 2 , one embodiment of themultiplexer 26 outputs the selected signal to the chroma-key mixer 28 and theswitcher 30. In other embodiments of the invention, other switching devices or routers may be used in place of themultiplexer 26 to select between multiple input signals and to communicate the selected signal to other components. In other embodiments of the invention, themultiplexer 26 may receive signals from other video sources. Themultiplexer 26 may also be coupled to more or fewer video sources than are depicted inFIG. 2 . - The term “chroma-key” as used herein is a broad term and is used in its ordinary sense and refers to without limitation a system, device, or process that is used to create an effect wherein at least one color or hue in a video image is eliminated or substituted for with a different image. For example, a chroma-key technique, also referred to as color separation overlay, may utilize a mixer or like device to substitute a color, such as blue or green, in one video image for portions another video image.
- The chroma-
key mixer 28 receives, as inputs, signals from themultiplexer 26 and from thevideo recorder 16. The chroma-key mixer 28 processes these two input signals to form an output composite signal, which is communicated to theswitcher 30. The chroma-key mixer 28 may also receive control signals from thecontrol circuitry 20 or from theuser interface 32. - In one embodiment of the invention, the chroma-
key mixer 28 is used to create special visual effects that utilize the combination of two video signals to produce one composite image. In particular, the chroma-key mixer 28 is used to produce a composite image wherein it appears the subject from one video source, such as footage being captured by a video camera, is inserted into the footage from another video source, such as a movie on a DVD. This mixing by the chroma-key mixer 28 may be accomplished in real time. - In one embodiment, the chroma-
key mixer 28 produces a composite image by subtracting a chroma element or elements from the real-time image, such as an image being captured by thevideo recorder 16. The chroma element comprises at least one color that has been pre-selected or that is selected by the user, which color is used in the background for the video-recorded image. Upon receiving the real-time signal, the chroma-key mixer 28 removes the chroma element (the background) from the video recorded image, leaving only the image of the target subject. - For example, in so-called “green screening,” a target subject is positioned in front of a solid green screen. The image of the target subject is then captured by a video recorder and transmitted as a signal to the chroma-
key mixer 28. The chroma-key mixer 28 subtracts the chroma element (green) from the video recorder signal. This leaves only the image of the target subject along with “blank” portions where the real-time image had contained the chroma element. The chroma-key mixer 28 then replaces the subtracted, or blank, portions of the real-time image with portions of the image contained by the signal from themultiplexer 26, which signal contains scenes from a movie on a DVD. As a result of the signal processing, it appears that the target subject image, which is a real-time image, is present in a movie or other prerecorded video, thus forming a composite image. In other words, the composite image is made up of at least two video components: a foreground image, which consists of the non-chroma element portions of the video recorder signal, and a background image, which consists of the signal received from themultiplexer 26. - Though this example describes the functioning of one embodiment of the chroma-
key mixer 28, one with skill in the art will recognize that there exist other processes that may be used to produce a composite signal. For example, the chroma-key mixer 28 may directly substitute portions of the video source signal for the chroma-element portions of the real-time video signal. In another embodiment of the invention, the chroma-element portions of the real-time video signal are made transparent by the chroma-key mixer 28. This allows the non-chroma element portions of the real-time video signal to be layered on top of the video source signal to create the composite image. - Though the example given above has been in reference to an embodiment of the invention utilizing green as the chroma element, other colors may be used. For example, blue or red may be designated as the chroma element. In addition, multiple shades of the same color may be identified as chroma elements, allowing for a finer tuning of the composite image by the chroma-
key mixer 28. For example, in one embodiment of the invention, the user is able to select portions (or colors) of the real-time image that the user wishes to remove or make transparent by designating the colors as chroma elements.
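- The following sketch illustrates, in simplified software form, the per-pixel substitution described above. It assumes frames represented as lists of RGB tuples and uses a distance tolerance that stands in for the selection of multiple shades of the chroma element; it is not intended to describe the actual circuitry of the chroma-key mixer 28.

```python
def chroma_key(foreground, background, key_color=(0, 255, 0), tolerance=60):
    """Per-pixel chroma-key compositing of two equally sized RGB frames.

    Foreground pixels whose color is within `tolerance` of the key color (the
    chroma element) are treated as transparent and replaced by the corresponding
    background pixels; all other foreground pixels are kept. Frames are lists of
    rows of (r, g, b) tuples.
    """
    def is_chroma(pixel):
        # Allowing a tolerance lets several shades of the key color be removed.
        return sum((c - k) ** 2 for c, k in zip(pixel, key_color)) <= tolerance ** 2

    composite = []
    for fg_row, bg_row in zip(foreground, background):
        composite.append([bg if is_chroma(fg) else fg
                          for fg, bg in zip(fg_row, bg_row)])
    return composite

# Tiny 1x3 example: the middle (green) pixel is replaced by the background pixel.
fg = [[(200, 10, 10), (0, 255, 0), (10, 10, 200)]]
bg = [[(1, 1, 1), (2, 2, 2), (3, 3, 3)]]
assert chroma_key(fg, bg) == [[(200, 10, 10), (2, 2, 2), (10, 10, 200)]]
```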
- The chroma-key mixer 28 may perform the above-described process through various techniques. For example, in one embodiment of the invention, the chroma-key mixer 28 utilizes digital processing to create the composite image. In other embodiments, the chroma-key mixer 28 may create the composite image through optical techniques or through the use of analog real-time circuits that are known in the art. In yet other embodiments, the chroma-key mixer 28 comprises a luminance key mixer, which performs video compositing based on the brightness of portions of an image instead of color. - In the embodiment of the invention depicted in
FIG. 2 , the chroma-key mixer 28 outputs the composite signal to theswitcher 30. Theswitcher 30 also receives the output signal of themultiplexer 26. The function of theswitcher 30 is to select a single output signal from multiple input signals. For example, in one embodiment, theswitcher 30 selects between the signal from themultiplexer 26 and the composite signal from the chroma-key mixer 28. Theswitcher 30 makes its determination based on communications with thecontrol circuitry 20. In particular, the operation of theswitcher 30 is managed by thecontrol circuitry 20 based on the information contained in the data files, such as reference information regarding the beginning and ending reference points. - In one embodiment, the
control circuitry 20 cross-references each frame of a prerecorded video with the beginning and ending reference points contained in the data files corresponding to the specific video being played. When the DVD counter, whose value relates to the specific frame being played, matches or correlates with a beginning reference point value, theswitcher 30 is automatically instructed to select the composite signal from the chroma-key mixer. As a result, a target image being captured by thevideo recorder 16 is “inserted” or superimposed in the prerecorded video scene. When the DVD counter matches or correlates with an ending reference point, theswitcher 30 is automatically instructed to select the signal from themultiplexer 26, thus removing the image of the target subject from the prerecorded video scene. - For example, suppose a user wants to insert himself or herself for Darth Vader® in a video clip from Star Wars, which is being played by the
DVD player 24. First, the data files for the movie Star Wars that contain information relating to the video footage of Darth Vader® are accessed by the control circuitry 20. During frames not containing video footage of Darth Vader®, the signal from the multiplexer 26 (which comes from the DVD player 24) is selected by the switcher 30, which signal may be shown by the display 18. Viewers of the display 18 will see the normal footage from the Star Wars movie. The signal from the multiplexer 26 is selected by the switcher 30 until the control circuitry 20 instructs the switcher 30 to select the composite signal from the chroma-key mixer 28. This switching to the composite signal occurs when video footage of Darth Vader® is contained in the video source signal. The control circuitry 20 can identify the footage containing Darth Vader® by cross-referencing the relevant beginning and ending reference points from the data files. The beginning reference points identify the points or times in the movie when Darth Vader® enters a scene. As a result, when a beginning reference point matches or correlates with the DVD counter, which identifies a particular point of time or frame in the movie, the control circuitry 20 instructs the switcher 30 to select as an output the composite signal from the chroma-key mixer 28. Instead of seeing Darth Vader® on the display 18, viewers see in his place the image of the real-time target subject, which is being captured by the video recorder 16. - The ending reference points identify the points or times in a movie when Darth Vader® leaves a movie scene. When an ending reference point matches or correlates with the DVD counter, the
control circuitry 20 then instructs theswitcher 30 to select as an output the signal from themultiplexer 26. As a result, the target image, which is being captured by thevideo recorder 16, is not shown on thedisplay 18. - The reference information, therefore, is used in one embodiment of the invention to automatically control the switching process between the signal from the
video source 14 and the composite signal from the chroma-key mixer 28. For example, in one embodiment of the invention the reference information comprises beginning and ending reference points that correspond to the presence of a particular character in a movie or that indicate other points when it would be desirable to superimpose a real-time target image on a prerecorded image. The reference information of the data files may also be used to manage the audio components of the signals received from themultiplexer 26 and thevideo recorder 16. For example, in one embodiment, thecontrol circuitry 20 instructs theswitcher 30 to: (1) include only the audio component of the signal from themultiplexer 26 in the output signal, (2) include only the audio component of the signal from thevideo recorder 16 in the output signal, or (3) include both the audio components of the signals from themultiplexer 26 and thevideo recorder 16 in the output signal. - In another embodiment, the reference information is also used to manage the display of voice prompts, such as are used in karaoke files. For example, the reference information may indicate when to show voice prompts for a particular character when the character enters a scene. The reference information may also indicate when to remove or not display voice prompts for the particular character. In such an embodiment of the invention, the reference information corresponding to voice prompts may be located in the same data file as, or in a separate data file from, the reference information corresponding to video or audio components of the video source.
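- The switching behavior described above can be summarized by the following illustrative sketch. The frame-indexed decision, the signal labels and the three audio modes mirror the description of the switcher 30 and the control circuitry 20, but the function and its names are otherwise assumptions made for illustration only.

```python
# Illustrative control logic only; the device described above implements this in
# control circuitry rather than software, and the names below are assumptions.
AUDIO_SOURCE_ONLY, AUDIO_RECORDER_ONLY, AUDIO_BOTH = range(3)

def select_outputs(frame, segments, audio_mode=AUDIO_BOTH):
    """Decide which signals should be output for a given frame of the video source.

    `segments` is a list of (beginning_frame, ending_frame) reference points.
    Returns a (video_selection, audio_selections) pair: the video selection is
    "composite" while the catalogued character is on screen and "source"
    otherwise, mirroring how the reference points drive the switcher.
    """
    in_scene = any(begin <= frame <= end for begin, end in segments)
    video = "composite" if in_scene else "source"

    if audio_mode == AUDIO_SOURCE_ONLY:
        audio = ["source"]
    elif audio_mode == AUDIO_RECORDER_ONLY:
        audio = ["recorder"]
    else:
        audio = ["source", "recorder"]
    return video, audio

# While the DVD counter sweeps through the catalogued segment, the composite
# signal is selected; outside it, the prerecorded signal passes through.
segments = [(100, 200)]
assert select_outputs(50, segments)[0] == "source"
assert select_outputs(150, segments)[0] == "composite"
```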
- Although the terms "beginning reference points" and "ending reference points" are used herein to describe the functioning of the
compositing system 10, one with skill in the art will recognize that the beginning and ending reference points may be structurally and functionally equivalent. For example, in one embodiment, reference points stored in the data files are not identified as beginning or ending reference points. The reference points may be used by thecontrol circuitry 20 to output a signal that causes theswitcher 30 to change its state no matter what state the switcher was operating in previously. - With continued reference to
FIG. 2 , the depicted embodiment of thecompositor device 12 comprises theuser interface 32. Theuser interface 32 comprises any interface that accepts input from a user and/or conveys information to a user. In one embodiment, theuser interface 32 is coupled to the chroma-key mixer 28 and to thecontrol circuitry 20. In other embodiments, theuser interface 32 may be coupled to more or fewer components of thecompositor device 12. For example, the user interface may be directly coupled to theDVD player 24 to control the operation of the DVD player without the use of thecontrol circuitry 20. - One embodiment of the
user interface 32 is illustrated inFIG. 3 . Theuser interface 32 comprises a front tray portion of theDVD player 24, adisplay 40, editing controls 42, cropping/chroma controls 44 and acamera input display 46. In other embodiments of the invention, theuser interface 32 may comprise more or fewer components. For example, theuser interface 32 may operate without thedisplay 40 or without the editing controls 42. - The
display 40 conveys to the user information regarding the operation of the compositor device 12. For example, the display 40 may depict information regarding the tracks of an inserted DVD (either by time or by frame number), the chroma color selections, the data files (such as the film title or the tracks/scenes available for substitution) and other information to assist the user. In one embodiment, the display 40 is a light emitting diode (LED) display. In other embodiments, the display 40 is a liquid crystal display (LCD). The display 40 may convey information through the use of graphics, characters, or both. - The cropping/chroma controls 44 allow a user to modify in real time the video image being captured by the
video recorder 16 so that the image conforms to the prerecorded video scene in which the image is inserted. In one embodiment, the cropping/chroma controls 44 allow the user to select the chroma element or elements to be subtracted from the captured video image. Such selection may be made by selecting the name of a particular color or by selecting a visual representation of the color that is shown on theuser interface display 40 or theexternal display 18. - The cropping/chroma controls 44 also allow the user to crop the captured video image such that it “fits” in the prerecorded background image. These controls may be used to zoom out or zoom in on a target subject in order to adjust the size of the target subject to be in proportion with other objects in the prerecorded scene on to which the target subject is superimposed.
- Other embodiments of the invention may contain other controls for modifying the captured video. For example, the
user interface 32 may comprise a color saturation control that adjusts the color level of the captured video. This allows for a color image to be adjusted so as to blend in a black and white background. - The
camera input display 46 identifies thevideo recorders 16 that are connected to thecompositor device 12 and that are available to capture video for processing. For example, ifmultiple video recorders 16 were coupled to thecompositor device 12, then multiple lights of thecamera input display 46 may be illuminated. In other embodiments of the invention, thecamera input display 46 identifies when the video being captured by one of thevideo recorders 16 is being processed and output to thedisplay 18. In other embodiments of the invention, theuser interface 32 does not include thecamera input display 46. - In one embodiment, the
user interface 32 also comprises controls that are generally found on CD/DVD players. For example, the user interface comprises apower button 48 to turn the machine on and off. The user interface also comprises DVD/CD controls 50, such as play, rewind, fast forward, stop, pause, eject and the like, that are used to control the operation of theDVD player 24. - In one embodiment, the
user interface 32 also includes a remote control input (not shown). The remote control input may accept instructions or data that are transmitted to theuser interface 32 from one or more remote control devices. These instructions may correspond to controls that are present on theuser interface 32 or may include more or fewer instructions that enable the user to manage the operation of thecompositor device 12. - Though
FIG. 3 depicts one implementation of theuser interface 32, other types of user interfaces may be used in other embodiments. For example, theuser interface 32 may comprise a touch screen that both displays information to a user and accepts input from the user. In other embodiments, theuser interface 32 may accept instructions through voice recognition or may be coupled to another system or device, such as a keyboard or personal computer, that accepts input from a user. In yet other embodiments, thecompositor device 12 operates without auser interface 32. In such embodiments, a user interface may be incorporated into thedisplay 18. For example,display 18 may show on-screen instructions to a user or accept commands inputted by the user. - As shown by the embodiment of the
compositor device 12 inFIG. 2 , thecontrol circuitry 20 manages the content of the signal outputted to thedisplay 18. In one embodiment, the output of theswitcher 30 is coupled to thecontrol circuitry 20. In such an embodiment, thecontrol circuitry 20 may select to output the signal from theswitcher 30 or may output other video content that is stored in thememory 22. For example, in one embodiment, prerecorded scenes are stored in thememory 22, and upon input from the user, thecontrol circuitry 20 outputs these scenes instead of the signal from theswitcher 30. In another embodiment, thecontrol circuitry 20 is used to overlay information such as subtitles or dialogue prompts over the signal from theswitcher 30. In yet other embodiments of the invention, the output of theswitcher 30 is coupled to thedisplay 18 without intervention by thecontrol circuitry 20. - In yet other embodiments of the invention, the
compositor device 12 includes input/output ports (not shown) that interface with external systems or devices. For example, thecompositor device 12 may comprise a serial port, a universal serial bus (USB), a firewire port, or the like that can be used to download data files from a personal computer or handheld device. -
FIG. 4 is a block diagram of an embodiment of the invention wherein thecompositor device 12 is structured to receive input from multiple video recorders. As can be seen inFIG. 4 , thecompositor device 12 includes the same components as the embodiment of thecompositor device 12 depicted inFIG. 2 . Thecompositor device 12 ofFIG. 4 additionally comprises a second chroma-key mixer 54 and asecond switcher 56. - In one embodiment, the second chroma-
key mixer 54 functions similarly to the first chroma-key mixer 28. In particular, the second chroma-key mixer 54 receives, as inputs, signals from themultiplexer 26, the first chroma-key mixer 28 and asecond video recorder 16. The second chroma-key mixer 54 may also receive instructions from thecontrol circuitry 20. Like the first chroma-key mixer 28, the second chroma-key mixer 54 removes the chroma element from a real-time image, such as captured by the second video recorder, and combines the modified real-time image with another signal to form a composite signal. In an embodiment having two video recorders, the second chroma-key mixer 54 may combine the real-time image captured by the second video recorder with either the prerecorded video from themultiplexer 26 or the composite signal outputted by the first chroma-key mixer 28. The second chroma-key mixer 54 then outputs a second composite signal to thesecond switcher 56. - In other embodiments, the second chroma-
key mixer 54 may be external to the compositor device 12. In yet other embodiments, portions of the second chroma-key mixer 54 may be external to the compositor device 12 and portions of the second chroma-key mixer 54 may be internal to the compositor device 12. - The
second switcher 56 functions similarly to thefirst switcher 30. In one embodiment, thesecond switcher 56 receives, as inputs, signals from thefirst switcher 30 and from the second chroma-key mixer 54. Thesecond switcher 56 selects between these inputs based upon instructions received from thecontrol circuitry 20. Thesecond switcher 56 may output a signal to thecontrol circuitry 20 or to thedisplay 18. In one embodiment of the invention, the content of the output signal of thesecond switcher 56 may be: (1) the prerecorded signal from themultiplexer 26, (2) the composite signal from the first chroma-key mixer 28 having portions of an image from the first video recorder, (3) the second composite signal from the second chroma-key mixer 54 having portions of an image from the second video recorder, or (4) the second composite signal from the second chroma-key mixer 54 having portions of images from the first video recorder and from the second video recorder. - The operation of one embodiment of the video compositing system will now be described.
FIG. 5 illustrates one embodiment of an interactivevideo compositing process 100. Thecompositing process 100 begins withState 105. AtState 105, the user selects a video source that the user wants to use as a background image for a final composite image. For example, the user may insert a favorite movie into theDVD player 24. - After the user has selected a particular video source, the
compositing process 100 continues withState 110. AtState 110, data files associated with the video source are accessed by thecompositor device 12. For example, when a DVD is inserted into theDVD player 24, the serial number of the DVD is communicated to thecontrol circuitry 20, which uses the serial number to identify and access the appropriate prerecorded data files. The data files identify scenes recorded on the DVD that have been catalogued as being suitable for chroma key substitution. In one embodiment of the invention, the suitable scenes are identified by the data files stored in thememory 22 of thecompositor device 12. These data files may have been preprogrammed in thememory 22, may have been downloaded or saved from another source, or may be saved on the selected DVD, such as in the bonus materials section of the DVD. In one embodiment, the scenes catalogued as being available for chroma-key substitution generally contain video footage of a character for which a user can substitute his or her own image. In one embodiment, the data files comprise reference information that identifies which scenes of the DVD contain video footage of the particular character. - The available scenes are communicated to the user, such as through the
display 40 of theuser interface 32. In another embodiment, the available scenes may be communicated through an external display, such asdisplay 18. AtState 115, the user selects an available scene into which the user wants to superimpose or “insert” a real-time image. In one embodiment, the user makes the selection through theuser interface 32, such as through a remote control. - After the user selects an available scene, the
compositing process 100 proceeds withState 120. AtState 120, thevideo recorder 16 is used to capture a target image. This target image is the image that is used to overlay, or be inserted into, in real time, the scenes from the video source. The target image is preferably positioned in front of an evenly lit, solid colored background, which color represents the chroma element. For example, in one embodiment of the invention wherein the selected chroma element is green, the target image is positioned in front of a “green screen.” Other colors or types of backgrounds may be used that enable the background to be later “removed” when forming a composite image with scenes from the video source. - In one embodiment, a video camera is utilized to capture the target image and convert the image into a signal that is input into the chroma-
key mixer 28 of thecompositor device 12. It should be recognized that, along with a video image, an audio signal may also be recorded by the video camera and input into thecompositor device 12. - The chroma-
key mixer 28 of thecompositor device 12 creates a composite image through the processes that have been previously discussed. In one embodiment, the chroma-key mixer 28 removes the chroma element from the image captured by the video camera, leaving only the video image of the target image. Then, corresponding portions of the image from the video source are used to fill in the removed portions of the video recorded image, thus forming a composite image that appears to have the target image inserted into scenes from the video source. - In another embodiment, the chroma element portions of the video-captured image are made transparent, such as through digital processing, leaving only the target image. The modified video-captured image is then layered on top of the video source image. As a result, portions from the background video source image that are located underneath the top layer target image are not seen on the display of the composite image. However, portions of the background video source image located underneath the transparent portions of the top layer (those portions that had contained the chroma element) are viewable.
- The
compositing process 100 then moves toState 125. AtState 125, it is determined whether the video source image or the composite image is to be selected as the output image. In one embodiment, this selection of the output image is made by theswitcher 30. The control of this selection may be performed automatically (without user interaction) by thecompositor device 12 using information from the prerecorded data files, or the user may control the operation of theswitcher 30 through theuser interface 32. For example, thecontrol circuitry 20, based upon the beginning and ending reference points contained in the data files, may instruct theswitcher 30 when to output the video source image and when to output the composite image that has the target image overlaid on the video source image. In another embodiment, the user may also manually control when such switching occurs. Such manual control allows the user more freedom to create desired scenes or special effects. - If at
State 125, theswitcher 30 is instructed to select the composite image, thecompositing process 100 moves toState 130. AtState 130, the composite image is shown on thedisplay 18. Viewers of thedisplay 18 will observe the real-time target image inserted into the prerecorded footage from the video source. For example, viewers may see the target image replacing a character in a movie playing on theDVD player 24. AfterState 130, thecompositing process 100 moves toState 140. - If at
State 125, theswitcher 30 is instructed to select the video source image, thecompositing process 100 moves toState 135. AtState 135, the image from the video source is shown on thedisplay 18. AfterState 135, thecompositing process 100 then proceeds toState 140. - At
State 140, it is determined whether or not preprogrammed data is to be displayed instead of the video source image or the composite image. Such preprogrammed data may include, for example, prerecorded scenes that are stored in the memory 22. In one embodiment, the prerecorded scenes comprise video clips that users may want to insert in order to make the displayed scenes appear more interactive or more life-like. For example, prerecorded video clips having various forms of feedback from judges could be inserted after a target subject has acted out a scene (which was observed by the viewers of the display 18). In one embodiment, different video clips are selected to be displayed based on input given by the viewers. - If preprogrammed data is to be displayed, the
compositing process 100 moves to State 145. At State 145, the preprogrammed data is communicated to the display 18. In one embodiment, the control circuitry 20 manages which signal is communicated to the display 18. In another embodiment, a multiplexer or other similar device may be used to select which signal is output to the display 18. The length of time that the preprogrammed data is displayed may be directly controlled by the user or may be a set length of time, such as, for example, until the end of the particular video clip. Upon completion of State 145, the compositing process 100 returns to State 140. If preprogrammed data is not to be displayed, the compositing process 100 returns to State 125 to determine whether the video source image or the composite image is to be displayed.
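- The decision loop of States 125 through 145 can be summarized by the following illustrative sketch. The callables standing in for reference-data lookups, viewer input and the display are assumptions made only to render the flow of FIG. 5 concrete.

```python
# A compressed, software-only sketch of the FIG. 5 decision loop (States 125-145).
def run_compositing_loop(frames, composite_requested, clip_requested, show):
    """Walk States 125, 130/135 and 140/145 once per frame of the video source.

    `composite_requested(frame)` stands for State 125 (reference data or the user
    deciding whether the composite image is output); `clip_requested(frame)` stands
    for State 140 (whether preprogrammed data, such as a stored judge-feedback
    clip, is shown instead); `show(kind, frame)` stands for the display 18.
    """
    for frame in frames:
        if composite_requested(frame):
            show("composite", frame)       # State 130: composite image displayed
        else:
            show("source", frame)          # State 135: video source image displayed
        if clip_requested(frame):
            show("preprogrammed", frame)   # State 145: stored clip displayed

# Example: frames 2 and 3 are catalogued for compositing; no stored clips play.
shown = []
run_compositing_loop(
    range(5),
    composite_requested=lambda f: f in (2, 3),
    clip_requested=lambda f: False,
    show=lambda kind, f: shown.append((f, kind)),
)
assert shown[0] == (0, "source") and shown[2] == (2, "composite")
```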
- One with skill in the art will recognize that the compositing process 100 illustrated in FIG. 5 is only one example of the functioning of the compositing system 10. For example, in other embodiments of the invention, the compositing process 100 may include more or fewer states, or states may be combined or executed in a different order. For example, additional states may be added that illustrate the separate control of audio signals and video signals. In other embodiments, preprogrammed data may be displayed at the beginning of the compositing process 100 or upon the selection of a particular video source for playback. - The
compositing system 10 may be particularly useful with preprogrammed video that is easily adapted to allow for user interaction. For example, to help further illustrate an embodiment of thecompositing process 100, assume that the user selects a DVD that has recorded scenes from the television talent show American Idol. Data files corresponding to video segments of the DVD are stored in thememory 22 of thecompositor device 12. Upon insertion of the DVD, the user is provided with options of scenes that are available for user interaction. For example, the user may have the option to select different scenes in which the user may “perform” in front of the judges or an audience. - Through the
user interface 32, the user selects a scene for video compositing. For example, the user may pick a scene in which a contestant is performing by signing a song in front of the judges. A user whose image is going to be substituted into the American Idol footage is positioned in front of thevideo recorder 16 with a green screen serving as the background. The individual then performs as if he or she was actually participating on the American Idol program, the performance of the individual being captured by thevideo recorder 16 and converted to a signal communicated to thecompositor device 12. - The
display 18 shows video and audio from the American Idol program. During the American Idol scenes that would normally contain footage of the participant on the actual program, the image of the individual being recorded by thevideo recorder 16 is substituted in real time for the participant. As a result, it appears to viewers of thedisplay 18 that the individual is actually participating on the American Idol program. The timing of the substitution of images is determined by the reference information recorded in the data files, which process has been described previously. In addition, the user may also manually control which images are output to thedisplay 18. - Audio signals that are captured by the
 - Audio signals that are captured by the video recorder 16 are also output through the display 18. Like the video images, the substitution of the real-time audio signals for the audio of the recorded footage may occur at appropriate points in the American Idol scenes, such as when the participant is performing or singing. The audio substitution need not occur at the same times as the video substitution. For example, there may exist portions of American Idol footage that contain the voice of the participant but that do not contain the video image of the participant. Again, the substitution of the audio signals may be automatically controlled by the compositor device 12 based on the data file information and/or may be manually controlled by the user.
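 - The independence of the audio and video substitution windows can be pictured with a small sketch; the frame numbers and window boundaries below are invented for illustration:

```python
def substitution_flags(frame, video_window, audio_window):
    """Return (substitute_video, substitute_audio) for a single frame.

    The two frame windows are deliberately independent, reflecting the point
    that the participant's voice may be needed in footage that never shows
    the participant on screen.  All frame numbers here are hypothetical.
    """
    substitute_video = video_window[0] <= frame <= video_window[1]
    substitute_audio = audio_window[0] <= frame <= audio_window[1]
    return substitute_video, substitute_audio

# The audio window opens earlier than the video window (an off-screen line).
print(substitution_flags(1_000, video_window=(1_200, 4_800), audio_window=(900, 4_800)))
# -> (False, True): real-time audio is substituted while the original video still plays.
```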
 - After the performance, viewers have the option to rate the performance of the individual who has been inserted into the program. These viewer ratings may be used to select the display of prerecorded video clips having feedback from the judges on American Idol. For example, prerecorded video clips of good reviews, bad reviews, and average reviews may be stored in the memory 22 of the compositor device 12. The viewers then have the option of inputting their opinions of the performance, such as through remote controls communicating with the user interface 32. If the viewers rate the performance by the individual as being generally poor, then the compositor device 12 selects the playback of video clips that include the judges being critical of the performance. On the other hand, if the viewers rate the performance as being generally good, then the compositor device 12 selects the playback of video clips that give positive feedback from the judges.
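 - An informal sketch of that rating-to-clip selection follows; the score scale, thresholds, and clip names are assumptions made for the example only:

```python
from statistics import mean

# Invented mapping from an average viewer score (1-10) to a stored clip.
FEEDBACK_CLIPS = {
    "poor": "judges_critical_clip",
    "average": "judges_mixed_clip",
    "good": "judges_positive_clip",
}

def pick_feedback_clip(scores):
    """Choose which prerecorded judges' clip to play after a performance.

    The description above says only that generally poor ratings select
    critical clips and generally good ratings select positive ones; the
    numeric cutoffs here are illustrative.
    """
    average = mean(scores)
    if average < 4:
        return FEEDBACK_CLIPS["poor"]
    if average < 7:
        return FEEDBACK_CLIPS["average"]
    return FEEDBACK_CLIPS["good"]

print(pick_feedback_clip([8, 9, 7]))  # judges_positive_clip
print(pick_feedback_clip([2, 3, 4]))  # judges_critical_clip
```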
 - In other embodiments of the invention, the above-described American Idol program may be stored entirely in the memory 22 of the compositor device 12 without the use of a DVD. In yet other embodiments, the compositor device 12 may include a monitor that displays the appropriate voice prompts to the user, which prompts correspond to portions of the American Idol program. For example, the compositor device 12 may include a CD+G input or player that reads files and outputs a display having voice prompts.
 - Although embodiments described herein have related generally to real-time video compositing with sources such as movies, television programs and the like, other embodiments of the invention may utilize any source of video, audio, or a combination thereof. For example, in another embodiment of the invention, the compositor device may accept video or audio input from a video game system. This would allow a user to “insert” himself or herself into the video game and to interact with objects therein.
- While certain embodiments of the inventions have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the methods and systems described herein may be made without departing from the spirit of the inventions.
Claims (20)
1. A method for coordinating the display of multiple signals, the method comprising:
accessing a first signal comprising video content having multiple frames;
accessing a second signal comprising a video image of a participant;
forming a combined video signal comprising at least a portion of the first signal and at least a portion of the second signal, wherein the combined video signal is configured to cause the display of the video image of the participant in place of a scene element of the video content;
accessing reference data that corresponds to the first signal, the reference data comprising first reference data that identifies a first frame of the multiple frames in which the scene element is associated with a scene of the video content, and wherein the reference data comprises second reference data that identifies a second frame of the multiple frames in which the scene element is no longer associated with the scene of the video content; and
switching between displaying the first signal and displaying the combined video signal, wherein a timing of said switching is based on the reference data such that the combined video signal is selected at a first time identified by the first reference data whereby the participant appears in the scene in place of the scene element, and such that the first signal is selected at a second time identified by the second reference data whereby the participant no longer appears in the scene of the video content.
2. The method of claim 1, wherein the video image of the participant comprises a real-time image of the participant.
3. The method of claim 2, wherein said switching is performed in real time.
4. The method of claim 1, wherein said accessing the second signal comprises receiving the second signal via a network.
5. The method of claim 4, further comprising accessing a third signal comprising a second video image of a second participant.
6. The method of claim 5, wherein the combined video signal comprises at least portions of the first, second and third signals.
7. The method of claim 6, wherein the reference data comprises third reference data that identifies a third frame of the multiple frames in which a second scene element is associated with the scene of the video content.
8. The method of claim 1, wherein the first signal further comprises a first audio signal that corresponds to the video content.
9. The method of claim 8, wherein the second signal comprises a second audio signal that corresponds to audio of the participant, wherein at the first time identified by the first reference data the second audio signal is associated with the combined video signal.
10. The method of claim 1, wherein said forming the combined video signal comprises overlaying the at least the portion of the second signal on the first signal.
11. The method of claim 1, wherein the scene element comprises an actor or actress of the video content.
12. A system for providing interactive entertainment, the system comprising:
a first input configured to receive a first signal, the first signal comprising video content including a plurality of frames representing a scene;
a second input configured to receive a second signal, the second signal comprising an image of a participant; and
a processor in communication with the first input and the second input, the processor being configured to combine at least a portion of the first signal and at least a portion of the second signal, wherein the portion of the second signal comprises the image of the participant,
the processor being further configured to receive first reference data that identifies a first frame of the plurality of frames in which a scene element appears in the scene of the video content and to receive second reference data that identifies a second frame of the plurality of frames in which the scene element no longer appears in the scene of the video content,
and the processor being further configured to output the combined signal at a first time identified by the first reference data such that the participant appears in the scene as the scene element and to output the first video signal at a second time identified by the second reference data such that the participant no longer appears in the scene.
13. The system of claim 12, wherein the image comprises a real-time image of the participant.
14. The system of claim 12, wherein the second input is configured to receive the second signal via a network.
15. The system of claim 12, wherein the processor is further configured to adjust one or more attributes of the image of the participant when forming the combined signal.
16. The system of claim 15, wherein the one or more attributes comprises a size of the image of the participant.
17. The system of claim 15, wherein the one or more attributes comprises a color of the image of the participant.
18. The system of claim 12, wherein the video content comprises at least one of movie content, video game content and television program content.
19. The system of claim 12, wherein the scene element comprises a character of the video content.
20. A system for providing interactive entertainment, the system comprising:
means for receiving a first video signal, the first video signal comprising video content including a plurality of frames, the plurality of frames representing a scene;
means for receiving a second video signal, the second video signal comprising a video image of a participant; and
processing means for combining the first video signal and at least a portion of the second video signal, wherein the portion of the second video signal comprises the video image of the participant, and wherein the combined signal comprises a combined video signal with the participant as a character in the scene,
wherein the processing means further receives first reference data that identifies a first frame of the plurality of frames in which the character appears in the scene of the video content and receives second reference data that identifies a second frame of the plurality of frames in which the character exits the scene of the video content,
wherein the processing means further outputs the combined signal at a first time identified by the first reference data such that the participant appears in the scene as the character, and outputs the first video signal at a second time identified by the second reference data such that the participant no longer appears in the scene.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/256,374 US20090040385A1 (en) | 2003-05-02 | 2008-10-22 | Methods and systems for controlling video compositing in an interactive entertainment system |
US12/482,834 US7649571B2 (en) | 2003-05-02 | 2009-06-11 | Methods for interactive video compositing |
US12/482,785 US7646434B2 (en) | 2003-05-02 | 2009-06-11 | Video compositing systems for providing interactive entertainment |
US12/903,973 US20110025918A1 (en) | 2003-05-02 | 2010-10-13 | Methods and systems for controlling video compositing in an interactive entertainment system |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US46748003P | 2003-05-02 | 2003-05-02 | |
US10/836,729 US7528890B2 (en) | 2003-05-02 | 2004-04-30 | Interactive system and method for video compositing |
US12/256,374 US20090040385A1 (en) | 2003-05-02 | 2008-10-22 | Methods and systems for controlling video compositing in an interactive entertainment system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/836,729 Continuation US7528890B2 (en) | 2003-05-02 | 2004-04-30 | Interactive system and method for video compositing |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/482,834 Continuation US7649571B2 (en) | 2003-05-02 | 2009-06-11 | Methods for interactive video compositing |
US12/482,785 Continuation US7646434B2 (en) | 2003-05-02 | 2009-06-11 | Video compositing systems for providing interactive entertainment |
US12/903,973 Continuation US20110025918A1 (en) | 2003-05-02 | 2010-10-13 | Methods and systems for controlling video compositing in an interactive entertainment system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090040385A1 true US20090040385A1 (en) | 2009-02-12 |
Family
ID=33435079
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/836,729 Active 2026-07-24 US7528890B2 (en) | 2003-05-02 | 2004-04-30 | Interactive system and method for video compositing |
US12/256,374 Abandoned US20090040385A1 (en) | 2003-05-02 | 2008-10-22 | Methods and systems for controlling video compositing in an interactive entertainment system |
US12/256,312 Abandoned US20090041422A1 (en) | 2003-05-02 | 2008-10-22 | Methods and systems for controlling video compositing in an interactive entertainment system |
US12/482,834 Expired - Lifetime US7649571B2 (en) | 2003-05-02 | 2009-06-11 | Methods for interactive video compositing |
US12/482,785 Expired - Lifetime US7646434B2 (en) | 2003-05-02 | 2009-06-11 | Video compositing systems for providing interactive entertainment |
US12/903,973 Abandoned US20110025918A1 (en) | 2003-05-02 | 2010-10-13 | Methods and systems for controlling video compositing in an interactive entertainment system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/836,729 Active 2026-07-24 US7528890B2 (en) | 2003-05-02 | 2004-04-30 | Interactive system and method for video compositing |
Family Applications After (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/256,312 Abandoned US20090041422A1 (en) | 2003-05-02 | 2008-10-22 | Methods and systems for controlling video compositing in an interactive entertainment system |
US12/482,834 Expired - Lifetime US7649571B2 (en) | 2003-05-02 | 2009-06-11 | Methods for interactive video compositing |
US12/482,785 Expired - Lifetime US7646434B2 (en) | 2003-05-02 | 2009-06-11 | Video compositing systems for providing interactive entertainment |
US12/903,973 Abandoned US20110025918A1 (en) | 2003-05-02 | 2010-10-13 | Methods and systems for controlling video compositing in an interactive entertainment system |
Country Status (6)
Country | Link |
---|---|
US (6) | US7528890B2 (en) |
EP (1) | EP1621010A1 (en) |
JP (1) | JP4855930B2 (en) |
AU (1) | AU2004237705B2 (en) |
CA (1) | CA2523680C (en) |
WO (1) | WO2004100535A1 (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070204295A1 (en) * | 2006-02-24 | 2007-08-30 | Orion Electric Co., Ltd. | Digital broadcast receiver |
US20080252784A1 (en) * | 2007-04-16 | 2008-10-16 | General Electric Company | Systems and methods for multi-source video distribution and composite display |
US20090041422A1 (en) * | 2003-05-02 | 2009-02-12 | Megamedia, Llc | Methods and systems for controlling video compositing in an interactive entertainment system |
US20100031149A1 (en) * | 2008-07-01 | 2010-02-04 | Yoostar Entertainment Group, Inc. | Content preparation systems and methods for interactive video systems |
US20100142928A1 (en) * | 2005-08-06 | 2010-06-10 | Quantum Signal, Llc | Overlaying virtual content onto video stream of people within venue based on analysis of the people within the video stream |
US20130074119A1 (en) * | 2011-09-19 | 2013-03-21 | Myung Ryul Choi | Multimedia apparatus and method for controlling multimedia apparatus |
US20130235059A1 (en) * | 2012-03-08 | 2013-09-12 | Mitsubishi Electric Corporation | Image compositing apparatus |
US10332560B2 (en) | 2013-05-06 | 2019-06-25 | Noo Inc. | Audio-video compositing and effects |
Families Citing this family (69)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005105449A1 (en) * | 2004-05-04 | 2005-11-10 | Sys Tec, S.R.L. | Method and machine for aligning flexographic printing plates on printing cylinders |
US20060055707A1 (en) * | 2004-09-10 | 2006-03-16 | Fayan Randy M | Graphical user interface for a keyer |
US20060136979A1 (en) * | 2004-11-04 | 2006-06-22 | Staker Allan R | Apparatus and methods for encoding data for video compositing |
JP3110010U (en) * | 2005-01-25 | 2005-06-09 | 船井電機株式会社 | Video cassette recorder integrated optical disk device and video signal device |
US20080028422A1 (en) * | 2005-07-01 | 2008-01-31 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Implementation of media content alteration |
US9230601B2 (en) | 2005-07-01 | 2016-01-05 | Invention Science Fund I, Llc | Media markup system for content alteration in derivative works |
US9092928B2 (en) | 2005-07-01 | 2015-07-28 | The Invention Science Fund I, Llc | Implementing group content substitution in media works |
US9065979B2 (en) | 2005-07-01 | 2015-06-23 | The Invention Science Fund I, Llc | Promotional placement in media works |
US8126190B2 (en) | 2007-01-31 | 2012-02-28 | The Invention Science Fund I, Llc | Targeted obstrufication of an image |
US8203609B2 (en) | 2007-01-31 | 2012-06-19 | The Invention Science Fund I, Llc | Anonymization pursuant to a broadcasted policy |
US20070005651A1 (en) | 2005-07-01 | 2007-01-04 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Restoring modified assets |
US8910033B2 (en) | 2005-07-01 | 2014-12-09 | The Invention Science Fund I, Llc | Implementing group content substitution in media works |
US20090037278A1 (en) * | 2005-07-01 | 2009-02-05 | Searete Llc, A Limited Liability Corporation Of The State Of Delaware | Implementing visual substitution options in media works |
US9583141B2 (en) | 2005-07-01 | 2017-02-28 | Invention Science Fund I, Llc | Implementing audio substitution options in media works |
ES2674897T3 (en) * | 2005-07-18 | 2018-07-04 | Thomson Licensing | Method and device to handle multiple video streams using metadata |
US7477264B2 (en) * | 2005-08-12 | 2009-01-13 | Microsoft Corporation | Compositing external images into a multimedia rendering pipeline |
GB0525789D0 (en) * | 2005-12-19 | 2006-01-25 | Landesburg Andrew | Live performance entertainment apparatus and method |
US8015200B2 (en) * | 2005-12-24 | 2011-09-06 | Phil Seiflein | Multimedia platform synchronizer |
KR100843074B1 (en) * | 2006-03-14 | 2008-07-02 | 삼성전자주식회사 | Image output apparatus and method using numbers of chroma key color |
US9031381B2 (en) * | 2006-07-20 | 2015-05-12 | Panopto, Inc. | Systems and methods for generation of composite video from multiple asynchronously recorded input streams |
US20080059999A1 (en) * | 2006-08-29 | 2008-03-06 | John Winans | Multi-function display controller |
US9215512B2 (en) | 2007-04-27 | 2015-12-15 | Invention Science Fund I, Llc | Implementation of media content alteration |
AU2008202703B2 (en) * | 2007-06-20 | 2012-03-08 | Mcomms Design Pty Ltd | Apparatus and method for providing multimedia content |
KR100894055B1 (en) * | 2007-07-06 | 2009-04-20 | 드리머 | Media reproduction apparatus and method for receiving multimedia contents using the same |
KR100935862B1 (en) * | 2007-07-06 | 2010-01-07 | 드리머 | System for providing contents based on media reproduction apparatus |
US8169449B2 (en) * | 2007-10-19 | 2012-05-01 | Qnx Software Systems Limited | System compositing images from multiple applications |
US8135724B2 (en) * | 2007-11-29 | 2012-03-13 | Sony Corporation | Digital media recasting |
US8082592B2 (en) | 2008-01-12 | 2011-12-20 | Harris Technology, Llc | Read/write encrypted media and method of playing |
US8121409B2 (en) | 2008-02-26 | 2012-02-21 | Cyberlink Corp. | Method for handling static text and logos in stabilized images |
JP5207860B2 (en) * | 2008-07-14 | 2013-06-12 | パナソニック株式会社 | Video / audio playback apparatus and video / audio playback method |
US20100209073A1 (en) * | 2008-09-18 | 2010-08-19 | Dennis Fountaine | Interactive Entertainment System for Recording Performance |
JP4670938B2 (en) * | 2008-11-04 | 2011-04-13 | ソニー株式会社 | Video signal processing apparatus and video signal processing method |
US8135260B1 (en) * | 2008-12-05 | 2012-03-13 | Marcus Risso | Video generation system |
US20100153847A1 (en) * | 2008-12-17 | 2010-06-17 | Sony Computer Entertainment America Inc. | User deformation of movie character images |
JP2010217679A (en) * | 2009-03-18 | 2010-09-30 | Takeshi Ito | Video display method |
US8477149B2 (en) * | 2009-04-01 | 2013-07-02 | University Of Central Florida Research Foundation, Inc. | Real-time chromakey matting using image statistics |
US8248533B2 (en) * | 2009-11-19 | 2012-08-21 | Crucs Holdings, Llc | Coordinated video for television display |
US20110170008A1 (en) * | 2010-01-13 | 2011-07-14 | Koch Terry W | Chroma-key image animation tool |
US8508614B2 (en) * | 2010-06-02 | 2013-08-13 | Futurity Ventures LLC | Teleprompting system and method |
US8839118B2 (en) * | 2010-06-30 | 2014-09-16 | Verizon Patent And Licensing Inc. | Users as actors in content |
US20120017150A1 (en) * | 2010-07-15 | 2012-01-19 | MySongToYou, Inc. | Creating and disseminating of user generated media over a network |
US8296456B2 (en) | 2010-09-03 | 2012-10-23 | United Video Properties, Inc. | Systems and methods for displaying personalized media content |
WO2012030584A2 (en) * | 2010-09-03 | 2012-03-08 | United Video Properties, Inc. | Systems and methods for displaying personalized media content |
WO2012036647A1 (en) * | 2010-09-15 | 2012-03-22 | Panchenko Borys Evgenijovych | Method for automating digital multi-program multi-signal switching |
US9363448B2 (en) | 2011-06-02 | 2016-06-07 | Touchcast, Llc | System and method for providing and interacting with coordinated presentations |
TWI474201B (en) * | 2012-10-17 | 2015-02-21 | Inst Information Industry | Construction system scene fragment, method and recording medium |
US9686524B1 (en) | 2013-03-15 | 2017-06-20 | Tribune Broadcasting Company, Llc | Systems and methods for playing a video clip of an encoded video file |
US9451231B1 (en) * | 2013-03-15 | 2016-09-20 | Tribune Broadcasting Company, Llc | Systems and methods for switching between multiple software video players linked to a single output |
US11488363B2 (en) | 2019-03-15 | 2022-11-01 | Touchcast, Inc. | Augmented reality conferencing system and method |
US11659138B1 (en) | 2013-06-26 | 2023-05-23 | Touchcast, Inc. | System and method for interactive video conferencing |
US10297284B2 (en) * | 2013-06-26 | 2019-05-21 | Touchcast LLC | Audio/visual synching system and method |
US9852764B2 (en) | 2013-06-26 | 2017-12-26 | Touchcast LLC | System and method for providing and interacting with coordinated presentations |
US10757365B2 (en) | 2013-06-26 | 2020-08-25 | Touchcast LLC | System and method for providing and interacting with coordinated presentations |
US9787945B2 (en) | 2013-06-26 | 2017-10-10 | Touchcast LLC | System and method for interactive video conferencing |
US11405587B1 (en) | 2013-06-26 | 2022-08-02 | Touchcast LLC | System and method for interactive video conferencing |
US9661256B2 (en) | 2014-06-26 | 2017-05-23 | Touchcast LLC | System and method for providing and interacting with coordinated presentations |
US10075676B2 (en) | 2013-06-26 | 2018-09-11 | Touchcast LLC | Intelligent virtual assistant system and method |
US10523899B2 (en) | 2013-06-26 | 2019-12-31 | Touchcast LLC | System and method for providing and interacting with coordinated presentations |
US9666231B2 (en) | 2014-06-26 | 2017-05-30 | Touchcast LLC | System and method for providing and interacting with coordinated presentations |
US10356363B2 (en) | 2013-06-26 | 2019-07-16 | Touchcast LLC | System and method for interactive video conferencing |
US10084849B1 (en) | 2013-07-10 | 2018-09-25 | Touchcast LLC | System and method for providing and interacting with coordinated presentations |
US9761053B2 (en) | 2013-08-21 | 2017-09-12 | Nantmobile, Llc | Chroma key content management systems and methods |
US10182187B2 (en) | 2014-06-16 | 2019-01-15 | Playvuu, Inc. | Composing real-time processed video content with a mobile device |
US10255251B2 (en) | 2014-06-26 | 2019-04-09 | Touchcast LLC | System and method for providing and interacting with coordinated presentations |
US12079642B2 (en) * | 2016-10-31 | 2024-09-03 | Ati Technologies Ulc | Method and apparatus for dynamically reducing application render-to-on screen time in a desktop environment |
WO2019099737A2 (en) * | 2017-11-15 | 2019-05-23 | Touchcast LLC | Audio/visual synching system and method |
US10674111B2 (en) * | 2017-12-11 | 2020-06-02 | Disney Enterprises, Inc. | Systems and methods for profile based media segment rendering |
CN109194644A (en) * | 2018-08-29 | 2019-01-11 | 北京达佳互联信息技术有限公司 | Sharing method, device, server and the storage medium of network works |
CN118072409B (en) * | 2023-11-08 | 2024-11-08 | 北京京能招标集采中心有限责任公司 | Automatic generation method of evaluation mark video |
Family Cites Families (52)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS60190078A (en) | 1984-03-12 | 1985-09-27 | Fuji Photo Film Co Ltd | Picture synthesizing device |
JPS63178677A (en) | 1987-01-19 | 1988-07-22 | Nippon Telegr & Teleph Corp <Ntt> | Object image extracting device |
JPH01228288A (en) | 1988-03-08 | 1989-09-12 | Fujitsu Ltd | Picture processor |
JPH02127886A (en) | 1988-11-07 | 1990-05-16 | Nec Corp | Moving picture synthesizer and special effect device and video telephone set using same |
US5144454A (en) * | 1989-10-31 | 1992-09-01 | Cury Brian L | Method and apparatus for producing customized video recordings |
JPH03261279A (en) | 1990-03-12 | 1991-11-21 | Sanyo Electric Co Ltd | Video synthesizer |
JPH04220885A (en) | 1990-12-21 | 1992-08-11 | Nippon Telegr & Teleph Corp <Ntt> | Background elimination method and its execution device |
JP3036898B2 (en) | 1991-06-28 | 2000-04-24 | 日本放送協会 | Apparatus and method for generating key signal for image synthesis |
US5249967A (en) * | 1991-07-12 | 1993-10-05 | George P. O'Leary | Sports technique video training device |
US5861881A (en) * | 1991-11-25 | 1999-01-19 | Actv, Inc. | Interactive computer system for providing an interactive presentation with personalized video, audio and graphics responses for multiple viewers |
JPH05284522A (en) | 1992-04-03 | 1993-10-29 | Nippon Telegr & Teleph Corp <Ntt> | Video signal mixing processing method |
JP3261279B2 (en) | 1995-05-15 | 2002-02-25 | 松下電工株式会社 | Opening / closing device for receipt input port in courier service stamping device |
JPH09219836A (en) * | 1996-02-14 | 1997-08-19 | Matsushita Electric Ind Co Ltd | Image information recording method and image compositing device |
US6400374B2 (en) * | 1996-09-18 | 2002-06-04 | Eyematic Interfaces, Inc. | Video superposition system and method |
JPH10150585A (en) * | 1996-11-19 | 1998-06-02 | Sony Corp | Image compositing device |
US6072537A (en) * | 1997-01-06 | 2000-06-06 | U-R Star Ltd. | Systems for producing personalized video clips |
US5764306A (en) * | 1997-03-18 | 1998-06-09 | The Metaphor Group | Real-time method of digitally altering a video data stream to remove portions of the original image and substitute elements to create a new image |
IL123733A0 (en) | 1997-03-27 | 1998-10-30 | Rt Set Ltd | Method for compositing an image of a real object with a virtual scene |
JPH1169228A (en) | 1997-08-19 | 1999-03-09 | Brother Ind Ltd | Music sound reproducing machine |
JPH11252459A (en) | 1998-02-26 | 1999-09-17 | Toshiba Corp | Image compositing device |
TW468021B (en) * | 1998-03-27 | 2001-12-11 | Mitsubishi Heavy Ind Ltd | Ash melting furnace and ash melting method thereof |
JP2000209500A (en) * | 1999-01-14 | 2000-07-28 | Daiichikosho Co Ltd | Method for synthesizing portrait image separately photographed with recorded background video image and for outputting the synthesized image for display and karaoke machine adopting this method |
JP2002007014A (en) * | 2000-06-19 | 2002-01-11 | Yamaha Corp | Information processor and musical instrument provided with the information processor |
CN1383543A (en) * | 2000-06-20 | 2002-12-04 | 皇家菲利浦电子有限公司 | Karaoka system |
JP3667217B2 (en) | 2000-09-01 | 2005-07-06 | 日本電信電話株式会社 | System and method for supplying advertisement information in video, and recording medium recording this program |
JP2002232783A (en) * | 2001-02-06 | 2002-08-16 | Sony Corp | Image processor, method therefor and record medium for program |
JP2002281465A (en) | 2001-03-16 | 2002-09-27 | Matsushita Electric Ind Co Ltd | Security protection processor |
US7034833B2 (en) * | 2002-05-29 | 2006-04-25 | Intel Corporation | Animated photographs |
US7053915B1 (en) * | 2002-07-30 | 2006-05-30 | Advanced Interfaces, Inc | Method and system for enhancing virtual stage experience |
JP4220885B2 (en) | 2003-11-11 | 2009-02-04 | アスクル株式会社 | Proper order quantity calculation method, proper order quantity calculation system, proper order quantity calculation program |
GB2412802A (en) | 2004-02-05 | 2005-10-05 | Sony Uk Ltd | System and method for providing customised audio/video sequences |
JP4533791B2 (en) * | 2005-04-19 | 2010-09-01 | 株式会社日立製作所 | Information browsing device |
US20070064125A1 (en) * | 2005-09-16 | 2007-03-22 | Richard Didow | Chroma-key event photography |
US20070064126A1 (en) * | 2005-09-16 | 2007-03-22 | Richard Didow | Chroma-key event photography |
US20070064120A1 (en) * | 2005-09-16 | 2007-03-22 | Richard Didow | Chroma-key event photography |
JP4982065B2 (en) * | 2005-09-26 | 2012-07-25 | 株式会社東芝 | Video content display system, video content display method and program thereof |
US7966577B2 (en) * | 2005-10-11 | 2011-06-21 | Apple Inc. | Multimedia control center |
US20070122786A1 (en) * | 2005-11-29 | 2007-05-31 | Broadcom Corporation | Video karaoke system |
US7720283B2 (en) * | 2005-12-09 | 2010-05-18 | Microsoft Corporation | Background removal in a live video |
GB0525789D0 (en) | 2005-12-19 | 2006-01-25 | Landesburg Andrew | Live performance entertainment apparatus and method |
US20080059896A1 (en) * | 2006-08-30 | 2008-03-06 | Microsoft Corporation | Mobile Device User Interface |
US7581186B2 (en) * | 2006-09-11 | 2009-08-25 | Apple Inc. | Media manager with integrated browsers |
US20090059094A1 (en) * | 2007-09-04 | 2009-03-05 | Samsung Techwin Co., Ltd. | Apparatus and method for overlaying image in video presentation system having embedded operating system |
US8482635B2 (en) * | 2007-09-12 | 2013-07-09 | Popnoggins, Llc | System, apparatus, software and process for integrating video images |
US8560337B2 (en) * | 2007-09-28 | 2013-10-15 | Cerner Innovation, Inc. | User interface for generating and managing medication tapers |
US8177611B2 (en) * | 2007-12-21 | 2012-05-15 | Sony Computer Entertainment America Llc | Scheme for inserting a mimicked performance into a scene and providing an evaluation of same |
US8184141B2 (en) * | 2008-02-04 | 2012-05-22 | Siemens Enterprise Communications, Inc. | Method and apparatus for face recognition enhanced video mixing |
US9813671B2 (en) * | 2008-02-04 | 2017-11-07 | Unify Gmbh & Co. Kg | Method and apparatus for enhanced video mixing |
US8151215B2 (en) * | 2008-02-07 | 2012-04-03 | Sony Corporation | Favorite GUI for TV |
WO2009101153A2 (en) * | 2008-02-13 | 2009-08-20 | Ubisoft Entertainment S.A. | Live-action image capture |
US8515253B2 (en) * | 2008-02-15 | 2013-08-20 | Sony Computer Entertainment America Llc | System and method for automated creation of video game highlights |
WO2010002921A1 (en) * | 2008-07-01 | 2010-01-07 | Yoostar Entertainment Group, Inc. | Interactive systems and methods for video compositing |
- 2004
- 2004-04-30 US US10/836,729 patent/US7528890B2/en active Active
- 2004-04-30 CA CA2523680A patent/CA2523680C/en not_active Expired - Lifetime
- 2004-04-30 AU AU2004237705A patent/AU2004237705B2/en not_active Ceased
- 2004-04-30 WO PCT/US2004/013465 patent/WO2004100535A1/en active Application Filing
- 2004-04-30 EP EP04751057A patent/EP1621010A1/en not_active Withdrawn
- 2004-04-30 JP JP2006514184A patent/JP4855930B2/en not_active Expired - Lifetime
- 2008
- 2008-10-22 US US12/256,374 patent/US20090040385A1/en not_active Abandoned
- 2008-10-22 US US12/256,312 patent/US20090041422A1/en not_active Abandoned
- 2009
- 2009-06-11 US US12/482,834 patent/US7649571B2/en not_active Expired - Lifetime
- 2009-06-11 US US12/482,785 patent/US7646434B2/en not_active Expired - Lifetime
- 2010
- 2010-10-13 US US12/903,973 patent/US20110025918A1/en not_active Abandoned
Patent Citations (100)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4122490A (en) * | 1976-11-09 | 1978-10-24 | Lish Charles A | Digital chroma-key circuitry |
US4357624A (en) * | 1979-05-15 | 1982-11-02 | Combined Logic Company | Interactive video production system |
US4599611A (en) * | 1982-06-02 | 1986-07-08 | Digital Equipment Corporation | Interactive computer-based information display system |
US4710873A (en) * | 1982-07-06 | 1987-12-01 | Marvin Glass & Associates | Video game incorporating digitized images of being into game graphics |
US4521014A (en) * | 1982-09-30 | 1985-06-04 | Sitrick David H | Video game including user visual image |
US4827344A (en) * | 1985-02-28 | 1989-05-02 | Intel Corporation | Apparatus for inserting part of one video image into another video image |
US4688105A (en) * | 1985-05-10 | 1987-08-18 | Bloch Arthur R | Video recording system |
US4688105B1 (en) * | 1985-05-10 | 1992-07-14 | Short Takes Inc | |
US5184295A (en) * | 1986-05-30 | 1993-02-02 | Mann Ralph V | System and method for teaching physical skills |
US4891748A (en) * | 1986-05-30 | 1990-01-02 | Mann Ralph V | System and method for teaching physical skills |
US4800432A (en) * | 1986-10-24 | 1989-01-24 | The Grass Valley Group, Inc. | Video Difference key generator |
US4968132A (en) * | 1989-05-24 | 1990-11-06 | Bran Ferren | Traveling matte extraction system |
US5149967A (en) * | 1989-10-13 | 1992-09-22 | Hitachi, Ltd. | Charged-particle beam apparatus |
US5099337A (en) * | 1989-10-31 | 1992-03-24 | Cury Brian L | Method and apparatus for producing customized video recordings |
US5151793A (en) * | 1990-02-26 | 1992-09-29 | Pioneer Electronic Corporation | Recording medium playing apparatus |
US5428401A (en) * | 1991-05-09 | 1995-06-27 | Quantel Limited | Improvements in or relating to video image keying systems and methods |
US5264933A (en) * | 1991-07-19 | 1993-11-23 | Princeton Electronic Billboard, Inc. | Television displays having selected inserted indicia |
US5566251A (en) * | 1991-09-18 | 1996-10-15 | David Sarnoff Research Center, Inc | Video merging employing pattern-key insertion |
US5982350A (en) * | 1991-10-07 | 1999-11-09 | Eastman Kodak Company | Compositer interface for arranging the components of special effects for a motion picture production |
US7079176B1 (en) * | 1991-11-25 | 2006-07-18 | Actv, Inc. | Digital interactive system for providing full interactivity with live programming events |
US5381184A (en) * | 1991-12-30 | 1995-01-10 | U.S. Philips Corporation | Method of and arrangement for inserting a background signal into parts of a foreground signal fixed by a predetermined key color |
US5830065A (en) * | 1992-05-22 | 1998-11-03 | Sitrick; David H. | User image integration into audiovisual presentation system and methodology |
US6425825B1 (en) * | 1992-05-22 | 2002-07-30 | David H. Sitrick | User image integration and tracking for an audiovisual presentation system and methodology |
US5553864A (en) * | 1992-05-22 | 1996-09-10 | Sitrick; David H. | User image integration into audiovisual presentation system and methodology |
US20030148811A1 (en) * | 1992-05-22 | 2003-08-07 | Sitrick David H. | Image integration, mapping and linking system and methodology |
US20080085766A1 (en) * | 1992-05-22 | 2008-04-10 | Sitrick David H | Image integration with replaceable content |
US7137892B2 (en) * | 1992-05-22 | 2006-11-21 | Sitrick David H | System and methodology for mapping and linking based user image integration |
US5681223A (en) * | 1993-08-20 | 1997-10-28 | Inventures Inc | Training video method and display |
US6198503B1 (en) * | 1993-08-20 | 2001-03-06 | Steve Weinreich | Infra-red video key |
US20030051255A1 (en) * | 1993-10-15 | 2003-03-13 | Bulman Richard L. | Object customization and presentation system |
US6351265B1 (en) * | 1993-10-15 | 2002-02-26 | Personalized Online Photo Llc | Method and apparatus for producing an electronic image |
US5500684A (en) * | 1993-12-10 | 1996-03-19 | Matsushita Electric Industrial Co., Ltd. | Chroma-key live-video compositing circuit |
US6122013A (en) * | 1994-04-29 | 2000-09-19 | Orad, Inc. | Chromakeying system |
US5751337A (en) * | 1994-09-19 | 1998-05-12 | Telesuite Corporation | Teleconferencing method and system for providing face-to-face, non-animated teleconference environment |
US6061532A (en) * | 1995-02-24 | 2000-05-09 | Eastman Kodak Company | Animated image presentations with personalized digitized images |
US6072933A (en) * | 1995-03-06 | 2000-06-06 | Green; David | System for producing personalized video recordings |
US5953076A (en) * | 1995-06-16 | 1999-09-14 | Princeton Video Image, Inc. | System and method of real time insertions into video using adaptive occlusion with a synthetic reference image |
US6522787B1 (en) * | 1995-07-10 | 2003-02-18 | Sarnoff Corporation | Method and system for rendering and combining images to form a synthesized view of a scene containing image information from a second image |
US6835887B2 (en) * | 1996-09-26 | 2004-12-28 | John R. Devecka | Methods and apparatus for providing an interactive musical game |
US6144755A (en) * | 1996-10-11 | 2000-11-07 | Mitsubishi Electric Information Technology Center America, Inc. (Ita) | Method and apparatus for determining poses |
US6141060A (en) * | 1996-10-22 | 2000-10-31 | Fox Sports Productions, Inc. | Method and apparatus for adding a graphic indication of a first down to a live video of a football game |
US6283858B1 (en) * | 1997-02-25 | 2001-09-04 | Bgk International Incorporated | Method for manipulating images |
US6314197B1 (en) * | 1997-08-22 | 2001-11-06 | International Business Machines Corporation | Determining an alignment estimation between two (fingerprint) images |
US6750919B1 (en) * | 1998-01-23 | 2004-06-15 | Princeton Video Image, Inc. | Event linked insertion of indicia into video |
US6115052A (en) * | 1998-02-12 | 2000-09-05 | Mitsubishi Electric Information Technology Center America, Inc. (Ita) | System for reconstructing the 3-dimensional motions of a human figure from a monocularly-viewed image sequence |
US6624853B1 (en) * | 1998-03-20 | 2003-09-23 | Nurakhmed Nurislamovich Latypov | Method and system for creating video programs with interaction of an actor with objects of a virtual space and the objects to one another |
US6285408B1 (en) * | 1998-04-09 | 2001-09-04 | Lg Electronics Inc. | Digital audio/video system and method integrates the operations of several digital devices into one simplified system |
US6086380A (en) * | 1998-08-20 | 2000-07-11 | Chu; Chia Chen | Personalized karaoke recording studio |
US20080090679A1 (en) * | 1999-01-05 | 2008-04-17 | Browne H Lee | Video instructional system and method for teaching motor skills |
US6881067B2 (en) * | 1999-01-05 | 2005-04-19 | Personal Pro, Llc | Video instructional system and method for teaching motor skills |
US6350199B1 (en) * | 1999-03-16 | 2002-02-26 | International Game Technology | Interactive gaming machine and method with customized game screen presentation |
US6266452B1 (en) * | 1999-03-18 | 2001-07-24 | Nec Research Institute, Inc. | Image registration method |
US6126449A (en) * | 1999-03-25 | 2000-10-03 | Swing Lab | Interactive motion training device and method |
US6384821B1 (en) * | 1999-10-04 | 2002-05-07 | International Business Machines Corporation | Method and apparatus for delivering 3D graphics in a networked environment using transparent video |
US7230653B1 (en) * | 1999-11-08 | 2007-06-12 | Vistas Unlimited | Method and apparatus for real time insertion of images into video |
US6335765B1 (en) * | 1999-11-08 | 2002-01-01 | Weather Central, Inc. | Virtual presentation system and method |
US7161614B1 (en) * | 1999-11-26 | 2007-01-09 | Sanyo Electric Co., Ltd. | Device and method for converting two-dimensional video to three-dimensional video |
US7015978B2 (en) * | 1999-12-13 | 2006-03-21 | Princeton Video Image, Inc. | System and method for real time insertion into video with occlusion on areas containing multiple colors |
US7106906B2 (en) * | 2000-03-06 | 2006-09-12 | Canon Kabushiki Kaisha | Moving image generation apparatus, moving image playback apparatus, their control method, and storage medium |
US7613350B2 (en) * | 2000-03-06 | 2009-11-03 | Canon Kabushiki Kaisa | Moving image generation apparatus, moving image playback apparatus, their control method, and storage medium |
US7209181B2 (en) * | 2000-03-08 | 2007-04-24 | Mitchell Kriegman | System and method for compositing of two or more real images in a cinematographic puppetry production |
US6807290B2 (en) * | 2000-03-09 | 2004-10-19 | Microsoft Corporation | Rapid computer modeling of faces for animation |
US7221395B2 (en) * | 2000-03-14 | 2007-05-22 | Fuji Photo Film Co., Ltd. | Digital camera and method for compositing images |
US7123779B2 (en) * | 2000-06-02 | 2006-10-17 | Koninklijke Philips Electronics, N.V. | Method and apparatus for merging images into a composite image |
US6535269B2 (en) * | 2000-06-30 | 2003-03-18 | Gary Sherman | Video karaoke system and method of use |
US20020130889A1 (en) * | 2000-07-18 | 2002-09-19 | David Blythe | System, method, and computer program product for real time transparency-based compositing |
US6950542B2 (en) * | 2000-09-26 | 2005-09-27 | Koninklijke Philips Electronics, N.V. | Device and method of computing a transformation linking two images |
US6954498B1 (en) * | 2000-10-24 | 2005-10-11 | Objectvideo, Inc. | Interactive video manipulation |
US6763148B1 (en) * | 2000-11-13 | 2004-07-13 | Visual Key, Inc. | Image recognition methods |
US20050151743A1 (en) * | 2000-11-27 | 2005-07-14 | Sitrick David H. | Image tracking and substitution system and methodology for audio-visual presentations |
US7116330B2 (en) * | 2001-02-28 | 2006-10-03 | Intel Corporation | Approximating motion using a three-dimensional model |
US7034537B2 (en) * | 2001-03-14 | 2006-04-25 | Hitachi Medical Corporation | MRI apparatus correcting vibratory static magnetic field fluctuations, by utilizing the static magnetic fluctuation itself |
US7181081B2 (en) * | 2001-05-04 | 2007-02-20 | Legend Films Inc. | Image sequence enhancement system and method |
US6937295B2 (en) * | 2001-05-07 | 2005-08-30 | Junaid Islam | Realistic replication of a live performance at remote locations |
US20030007700A1 (en) * | 2001-07-03 | 2003-01-09 | Koninklijke Philips Electronics N.V. | Method and apparatus for interleaving a user image in an original image sequence |
US6873724B2 (en) * | 2001-08-08 | 2005-03-29 | Mitsubishi Electric Research Laboratories, Inc. | Rendering deformable 3D models recovered from videos |
US6816159B2 (en) * | 2001-12-10 | 2004-11-09 | Christine M. Solazzi | Incorporating a personalized wireframe image in a computer software application |
US20030108329A1 (en) * | 2001-12-12 | 2003-06-12 | Meric Adriansen | Advertising method and system |
US20050028193A1 (en) * | 2002-01-02 | 2005-02-03 | Candelore Brant L. | Macro-block based content replacement by PID mapping |
US7495689B2 (en) * | 2002-01-15 | 2009-02-24 | Pelco, Inc. | Multiple simultaneous language display system and method |
US7400752B2 (en) * | 2002-02-21 | 2008-07-15 | Alcon Manufacturing, Ltd. | Video overlay system for surgical apparatus |
US6771303B2 (en) * | 2002-04-23 | 2004-08-03 | Microsoft Corporation | Video-teleconferencing system with eye-gaze correction |
US7697787B2 (en) * | 2002-06-06 | 2010-04-13 | Accenture Global Services Gmbh | Dynamic replacement of the face of an actor in a video movie |
US20040152058A1 (en) * | 2002-06-11 | 2004-08-05 | Browne H. Lee | Video instructional system and method for teaching motor skills |
US6919892B1 (en) * | 2002-08-14 | 2005-07-19 | Avaworks, Incorporated | Photo realistic talking head creation system and method |
US7027054B1 (en) * | 2002-08-14 | 2006-04-11 | Avaworks, Incorporated | Do-it-yourself photo realistic talking head creation system and method |
US20040100581A1 (en) * | 2002-11-27 | 2004-05-27 | Princeton Video Image, Inc. | System and method for inserting live video into pre-produced video |
US7268834B2 (en) * | 2003-02-05 | 2007-09-11 | Axis, Ab | Method and apparatus for combining video signals to one comprehensive video signal |
US20060125962A1 (en) * | 2003-02-11 | 2006-06-15 | Shelton Ian R | Apparatus and methods for handling interactive applications in broadcast networks |
US7319493B2 (en) * | 2003-03-25 | 2008-01-15 | Yamaha Corporation | Apparatus and program for setting video processing parameters |
US20090041422A1 (en) * | 2003-05-02 | 2009-02-12 | Megamedia, Llc | Methods and systems for controlling video compositing in an interactive entertainment system |
US7528890B2 (en) * | 2003-05-02 | 2009-05-05 | Yoostar Entertainment Group, Inc. | Interactive system and method for video compositing |
US7646434B2 (en) * | 2003-05-02 | 2010-01-12 | Yoostar Entertainment Group, Inc. | Video compositing systems for providing interactive entertainment |
US7649571B2 (en) * | 2003-05-02 | 2010-01-19 | Yoostar Entertainment Group, Inc. | Methods for interactive video compositing |
US20060262696A1 (en) * | 2003-08-20 | 2006-11-23 | Woerlee Pierre H | Method and device for recording information on a multilayer information carrier |
US7285047B2 (en) * | 2003-10-17 | 2007-10-23 | Hewlett-Packard Development Company, L.P. | Method and system for real-time rendering within a gaming environment |
US7324166B1 (en) * | 2003-11-14 | 2008-01-29 | Contour Entertainment Inc | Live actor integration in pre-recorded well known video |
US20050215319A1 (en) * | 2004-03-23 | 2005-09-29 | Harmonix Music Systems, Inc. | Method and apparatus for controlling a three-dimensional character in a three-dimensional gaming environment |
US7559841B2 (en) * | 2004-09-02 | 2009-07-14 | Sega Corporation | Pose detection method, video game apparatus, pose detection program, and computer-readable medium containing computer program |
US20060136979A1 (en) * | 2004-11-04 | 2006-06-22 | Staker Allan R | Apparatus and methods for encoding data for video compositing |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090041422A1 (en) * | 2003-05-02 | 2009-02-12 | Megamedia, Llc | Methods and systems for controlling video compositing in an interactive entertainment system |
US20090237566A1 (en) * | 2003-05-02 | 2009-09-24 | Yoostar Entertainment Group, Inc. | Methods for interactive video compositing |
US20090237565A1 (en) * | 2003-05-02 | 2009-09-24 | Yoostar Entertainment Group, Inc. | Video compositing systems for providing interactive entertainment |
US7646434B2 (en) | 2003-05-02 | 2010-01-12 | Yoostar Entertainment Group, Inc. | Video compositing systems for providing interactive entertainment |
US7649571B2 (en) | 2003-05-02 | 2010-01-19 | Yoostar Entertainment Group, Inc. | Methods for interactive video compositing |
US20110025918A1 (en) * | 2003-05-02 | 2011-02-03 | Megamedia, Llc | Methods and systems for controlling video compositing in an interactive entertainment system |
US8625845B2 (en) | 2005-08-06 | 2014-01-07 | Quantum Signal, Llc | Overlaying virtual content onto video stream of people within venue based on analysis of the people within the video stream |
US20100142928A1 (en) * | 2005-08-06 | 2010-06-10 | Quantum Signal, Llc | Overlaying virtual content onto video stream of people within venue based on analysis of the people within the video stream |
US20070204295A1 (en) * | 2006-02-24 | 2007-08-30 | Orion Electric Co., Ltd. | Digital broadcast receiver |
US7978208B2 (en) * | 2007-04-16 | 2011-07-12 | General Electric Company | Systems and methods for multi-source video distribution and composite display |
US20080252784A1 (en) * | 2007-04-16 | 2008-10-16 | General Electric Company | Systems and methods for multi-source video distribution and composite display |
US20100027961A1 (en) * | 2008-07-01 | 2010-02-04 | Yoostar Entertainment Group, Inc. | Interactive systems and methods for video compositing |
US20100031149A1 (en) * | 2008-07-01 | 2010-02-04 | Yoostar Entertainment Group, Inc. | Content preparation systems and methods for interactive video systems |
US8824861B2 (en) | 2008-07-01 | 2014-09-02 | Yoostar Entertainment Group, Inc. | Interactive systems and methods for video compositing |
US9143721B2 (en) | 2008-07-01 | 2015-09-22 | Noo Inc. | Content preparation systems and methods for interactive video systems |
US20130074119A1 (en) * | 2011-09-19 | 2013-03-21 | Myung Ryul Choi | Multimedia apparatus and method for controlling multimedia apparatus |
US20130235059A1 (en) * | 2012-03-08 | 2013-09-12 | Mitsubishi Electric Corporation | Image compositing apparatus |
US9613597B2 (en) * | 2012-03-08 | 2017-04-04 | Mitsubishi Electric Corporation | Apparatus and method for image compositing based on detected presence or absence of base image |
US10332560B2 (en) | 2013-05-06 | 2019-06-25 | Noo Inc. | Audio-video compositing and effects |
Also Published As
Publication number | Publication date |
---|---|
US20090237565A1 (en) | 2009-09-24 |
CA2523680C (en) | 2015-06-23 |
US20090041422A1 (en) | 2009-02-12 |
AU2004237705B2 (en) | 2009-09-03 |
US20110025918A1 (en) | 2011-02-03 |
JP4855930B2 (en) | 2012-01-18 |
WO2004100535A1 (en) | 2004-11-18 |
US7649571B2 (en) | 2010-01-19 |
CA2523680A1 (en) | 2004-11-18 |
JP2006527529A (en) | 2006-11-30 |
EP1621010A1 (en) | 2006-02-01 |
US20090237566A1 (en) | 2009-09-24 |
AU2004237705A1 (en) | 2004-11-18 |
US7528890B2 (en) | 2009-05-05 |
US20040218100A1 (en) | 2004-11-04 |
US7646434B2 (en) | 2010-01-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7528890B2 (en) | | Interactive system and method for video compositing |
US5974218A (en) | | Method and apparatus for making a digest picture |
US8824861B2 (en) | | Interactive systems and methods for video compositing |
US7324166B1 (en) | | Live actor integration in pre-recorded well known video |
JP4327370B2 (en) | | Video mixer equipment |
USRE40688E1 (en) | | System for producing personalized video recordings |
US20060136979A1 (en) | | Apparatus and methods for encoding data for video compositing |
US7164441B1 (en) | | Video image producing method and apparatus |
US8213774B2 (en) | | Spotlight effect in video processing and playback |
JP2006135808A (en) | | Apparatus and method for reproduction |
JP2005026739A (en) | | Video content producing apparatus and program thereof |
KR20000067376A (en) | | A system for producing live image using song accompaniment system |
KR19990014570A (en) | | Instant Music Video Production System |
KR950006447B1 (en) | | Osd screen recording control method of camcorder |
KR19990034681U (en) | | Video Letter Production System |
JP5097149B2 (en) | | Content data playback device |
Muller | | Videotape Post Production: A Survey of Methods and Equipment |
KR20020057916A (en) | | The movie composition vending machine with chroma key |
JP2003186484A (en) | | Color superimposer for karaoke device |
JP3840894B2 (en) | | Image information processing method |
KR20010096687A (en) | | Self apparatus for composing image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: YOOSTAR ENTERTAINMENT GROUP, INC., NEW YORK; Free format text: MERGER;ASSIGNOR:MEGAMEDIA, LLC;REEL/FRAME:022089/0704; Effective date: 20080702 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |