Load Video From URL in Quartz Composer Visualizer - quartz-composer

I'm attempting to make a Quartz Composer composition that takes streaming video input from 16 different networked video cameras and outputs this video on a grid of 16 attached displays.
I'm attempting to do this in Quartz Composer and QC Visualizer with a dedicated processing composition that essentially has 16 Movie Importer patches with published outputs, and a rendering composition with 16 billboards arranged in a grid (where the billboards are attached to the movie importers). This works correctly when I use local videos in the movie importers; however, whenever I try to load streamed videos (which are QuickTime videos), QC Visualizer crashes when I try to run the compositions.
Any suggestions on how to fix this, or any other way I can accomplish my goal?
Thanks.

Related

Display RTSP stream in Adobe AIR

I am working on a project that involves displaying the video feed from an IP camera using Adobe AIR. I know that Flash does not have native support for the RTSP protocol, so I am evaluating all possible routes I can take to solve this issue:
Use Adobe Media Server to convert incoming RTSP stream to RTMP and then use Flash API (NetConnection & NetStream) directly.
Write a custom class to fetch, decode and display the stream in Adobe AIR. [I am unable to confirm whether this is possible due to insufficient info on the net]
Give up on RTSP and instead fetch JPEG/MJPEG sequence of images and display them in AIR relatively easily but with doubtful live performance. [due to JPEG/MJPEG refresh interval of IP camera and same interval separately in AIR]
Use DirectShow Video Source Filter for JPEG and M-JPEG IP Cameras to process the JPEG/MJPEG stream, create a virtual Webcam device (the filter does this automatically) and then use Camera class to display the video feed in AIR.
Use webcam 7, a piece of software designed to handle RTSP, JPEG/MJPEG and other stream protocols for many camera brands/models. It installs a driver that creates a virtual camera, which all other applications can then use as a normal webcam.
Unfortunately this software is buggy and often becomes unstable (though that could be specific to my particular camera model) and might even crash.
Are there any better, easier options that might not require any third-party software?
EDIT:
In case anybody else bumps into same problem:
As suggested by Rudolfs Bundulis, I decided to write a NativeProcess (ANE) that uses FFmpeg to fetch the RTSP stream data, transcode it, and feed it to the Flash player.
You might want to look at these for more specific steps:
http://www.purplesquirrels.com.au/2013/02/converting-video-with-ffmpeg-and-adobe-air/
https://www.youtube.com/watch?v=6N7eN9wvAGQ
Take the route described in option 2 - write an Adobe AIR native extension (ANE) that uses FFmpeg to handle the RTSP stream, decode it and pass the RGB data back to AIR for rendering. The hardest part would be compiling FFmpeg if you need cross-platform functionality; however, since you mention DirectShow, which is Windows-only, I assume you are bound to Windows. Zeranoe provides prebuilt FFmpeg libraries for Windows, Stack Overflow has a lot of topics on decoding a stream using FFmpeg, and then all you need is a callback to AIR and you're good.
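If you go the NativeProcess route rather than a full ANE, a minimal AS3 sketch of launching FFmpeg and reading its output could look like the code below. The executable path, RTSP URL and output format are placeholders, and NativeProcess requires the extendedDesktop profile in the AIR application descriptor:

    import flash.desktop.NativeProcess;
    import flash.desktop.NativeProcessStartupInfo;
    import flash.events.ProgressEvent;
    import flash.filesystem.File;
    import flash.utils.ByteArray;

    var process:NativeProcess;

    if (NativeProcess.isSupported)
    {
        var startupInfo:NativeProcessStartupInfo = new NativeProcessStartupInfo();
        // Hypothetical path to an ffmpeg binary bundled with the app.
        startupInfo.executable = File.applicationDirectory.resolvePath("ffmpeg/ffmpeg.exe");

        // Example arguments only: pull the RTSP stream and pipe raw RGB frames to stdout.
        var args:Vector.<String> = new Vector.<String>();
        args.push("-i", "rtsp://192.168.1.10:554/stream1");
        args.push("-f", "rawvideo", "-pix_fmt", "rgb24", "pipe:1");
        startupInfo.arguments = args;

        process = new NativeProcess();
        process.addEventListener(ProgressEvent.STANDARD_OUTPUT_DATA, onOutputData);
        process.start(startupInfo);
    }

    function onOutputData(event:ProgressEvent):void
    {
        // Raw RGB bytes arrive here; they still have to be sliced into
        // width*height*3-byte frames and copied into a BitmapData for display.
        var bytes:ByteArray = new ByteArray();
        process.standardOutput.readBytes(bytes, 0, process.standardOutput.bytesAvailable);
    }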

as3, why do embedded images do memory nomnom after compilation?

Situation
~~~~~~~~
I'm coding an AS3 video game and I was rather surprised when I checked the size of my project compared to the compiled swf.
My project size is 28 MB:
- assets is 26 MB
- src is 2 MB
When I compile it with embedded images, I get a 242 MB swf.
When I compile it without any embedded resources, I get an 18 MB swf.
Question:
~~~~~~~~
How can assets that are simply embedded in a project go from 26 MB to (242 - 18 =) 224 MB?
(I'm looking for a technical answer).
Notes:
~~~~~
I don't have any issue whatsoever at runtime.
I understand I have to use a loader and not embed big resources in my project; I just want to know why.
Perhaps your assets are somehow getting included multiple times. Luckily, there are some tools out there that can help you figure out what's going on:
In Adobe Flash CC (and similarly in other versions) you can go to File -> Publish Settings -> Flash (.swf) -> Advanced and check the box Generate Size Report. This will output some text in the output panel on publish that shows you the sizes and categories of everything in your swf.
Or, you could try the Visual Size Report add-on. Never tried it myself, so not sure if it's good or not.
If you're using FlashDevelop, and have a .swf as part of a project, you can see some details about its size. Here's an example.
Finally, if you're compiling with flex and mxmlc, you can add the -link-report filename option to generate a report that'll help you out.
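For reference, this is roughly what asset embedding looks like in code. If the same file is embedded from several classes like this, each [Embed] tag can end up compiling its own copy of the data into the swf, which is the kind of duplication a size report will reveal (class and file names here are made up):

    package
    {
        import flash.display.Bitmap;
        import flash.display.Sprite;

        public class Enemy extends Sprite
        {
            // The compiler transcodes the file into the swf at compile time.
            // Embedding the same enemy.png again from another class can add
            // another copy of its data to the output.
            [Embed(source = "../assets/enemy.png")]
            private static const EnemySkin:Class;

            public function Enemy()
            {
                addChild(new EnemySkin() as Bitmap);
            }
        }
    }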

Migrate from Flash AS3 to AIR

I've been developing a project in AS3 but decided to switch to AIR instead, as I found out it's impossible to save files on the user's hard drive without a prompt appearing. My question is, what changes do I have to expect? Is the code written in the same AS3 syntax/style? Did instantiating objects / drawing shapes / the positioning system / the stage change in AIR? Thanks.
Everything is the same from a code perspective, except that the AIR SDK includes quite a number of new APIs that are less restrictive and geared toward application development since the deployment target is no longer a web browser.
If you can, you should take a day or two and read through the documentation so you will know what's available to you. Make sure to select the most recent versions of AIR and Flash Player under Packages and Class Filters: Runtimes so the docs are populated with what you need.
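For example, saving a file straight to disk without a browse dialog, which is the restriction you ran into in the browser player, becomes trivial with the AIR file APIs. A minimal sketch (the file name and contents are arbitrary):

    import flash.filesystem.File;
    import flash.filesystem.FileMode;
    import flash.filesystem.FileStream;

    // Write into the app's own storage folder; no user prompt is needed in AIR.
    var target:File = File.applicationStorageDirectory.resolvePath("savegame.json");

    var stream:FileStream = new FileStream();
    stream.open(target, FileMode.WRITE);
    stream.writeUTFBytes('{"level": 3, "score": 1200}');
    stream.close();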

How to integrate, run or combine a Flash game with my Titanium Appcelerator app?

I have developed three different games in Flash (ActionScript 3) and they run well on my smartphone when I export them via Adobe AIR.
I am creating an app with Titanium, and I intend to integrate these three games into my main Titanium app. The problem is that Adobe AIR only exports APK files.
So I would like to know if there is any way of making the games run inside the main app, without needing to call external apps from the market, for example.
Thanks very much
One way to achieve this would be to use a modular architecture, meaning the main app would act as a wrapper for the three games.
In other words, you can load external SWF files (the games) in your main app using the SWFLoader class. Here is some more info about this class and how to use it, as well as module-wrapper communication info:
http://help.adobe.com/en_US/FlashPlatform/reference/actionscript/3/mx/controls/SWFLoader.html
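If the wrapper is plain AS3 rather than Flex, the same idea works with flash.display.Loader. A rough sketch, assuming the game swfs are packaged with the wrapper (the file name is a placeholder):

    import flash.display.Loader;
    import flash.events.Event;
    import flash.net.URLRequest;

    // Load one of the game swfs packaged alongside the wrapper app.
    var gameLoader:Loader = new Loader();
    gameLoader.contentLoaderInfo.addEventListener(Event.COMPLETE, onGameLoaded);
    gameLoader.load(new URLRequest("app:/games/game1.swf"));

    function onGameLoaded(event:Event):void
    {
        // Put the loaded game on the wrapper's display list.
        addChild(gameLoader);
    }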
Have a great day.

create a composite from bitmap and video file with FFmpeg or other binary via as3 frontend

After receiving much help with reading all the great stackoverflow topics in the past, I finally have to post a question myself.
For a client I need to create some sort of video editor for dummies, which has to generate a video file as output.
The editor has to load a movie file, scale and rotate it to a certain degree, and generate a composite video of a background bitmap and the rotated, repositioned video.
The frontend will be done in Flash/AS3 and has to use some background tools for processing the video.
Can I use FFmpeg to generate such a composite? Or is there any other good background task available?
edit:
Update 19.12: still haven't found a solution... any ideas from others?
thanks!
I don't think ffmpeg is the best tool for compositing. Instead, you could simply have Flash do the compositing, create the frames (as BitmapData objects) and upload them to some server-side script.
Then, once all the frames have been uploaded, use Flash to call a second script that builds the video using ffmpeg.
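A rough AS3 sketch of that approach, assuming the movie is already playing in a Video object and that a server-side upload script exists at the URL shown (the URL, encoder quality and transform values are placeholders; BitmapData.encode needs Flash Player 11.3 / AIR 3.3 or newer):

    import flash.display.BitmapData;
    import flash.display.DisplayObject;
    import flash.display.JPEGEncoderOptions;
    import flash.geom.Matrix;
    import flash.net.URLLoader;
    import flash.net.URLRequest;
    import flash.net.URLRequestMethod;
    import flash.utils.ByteArray;

    // Composite one frame: the background bitmap first, then the video object
    // drawn on top with a scale/rotation/position transform.
    function compositeFrame(background:BitmapData, video:DisplayObject,
                            scale:Number, rotation:Number, tx:Number, ty:Number):BitmapData
    {
        var frame:BitmapData = background.clone();

        var m:Matrix = new Matrix();
        m.scale(scale, scale);
        m.rotate(rotation);      // rotation in radians
        m.translate(tx, ty);
        frame.draw(video, m, null, null, null, true);
        return frame;
    }

    // Encode the frame as JPEG and POST it to a (hypothetical) server-side
    // script that stores the frames until ffmpeg assembles them into a video.
    function uploadFrame(frame:BitmapData, frameNumber:int):void
    {
        var jpeg:ByteArray = frame.encode(frame.rect, new JPEGEncoderOptions(85));

        var request:URLRequest = new URLRequest("http://example.com/upload_frame.php?n=" + frameNumber);
        request.method = URLRequestMethod.POST;
        request.contentType = "application/octet-stream";
        request.data = jpeg;
        new URLLoader(request);
    }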