We are developing a project, and part of it is showing a live preview on a web page. I should use a multicast stream because there will be too many clients and none of them will connect to the camera directly; I want to use the camera's multicast capability and avoid increasing network traffic. I want to keep this very simple: show the stream in an HTML img tag or something like it, and give as the source path (for ex : ) the multicast IP address of the camera. I googled and could not find any clear solution. By the way, I want to support all browsers, which is why I prefer the HTML img tag, and I do not want to embed any video stream plugin, for the sake of cross-platform support. I need ideas or suggestions and a clear example. Thanks in advance.
There are many ways. You can insert the RTSP link directly into your web page using the VLC plugin or the QuickTime plugin, or you can build a server that reads the stream from the camera (with Node.js, GStreamer, OpenCV, etc.) and either rebroadcasts it over HTTP or pushes the frames as base64 images, which the web page can read using websockets or the socket.io library; a sketch of that second approach follows.
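For illustration only, a minimal sketch of that relay-server idea in Node.js/TypeScript. It assumes ffmpeg is installed on the server, that the camera is reachable at a placeholder RTSP URL, and that the ws package is available; ffmpeg re-encodes the stream to MJPEG and the server pushes each complete JPEG frame to connected websocket clients as base64.

// relay.ts -- illustrative sketch; URL, port, and ffmpeg options are assumptions
import { spawn } from "child_process";
import WebSocket, { WebSocketServer } from "ws";

const wss = new WebSocketServer({ port: 8081 });

// Ask ffmpeg to pull the camera stream and emit raw MJPEG bytes on stdout.
const ffmpeg = spawn("ffmpeg", [
  "-i", "rtsp://camera-ip/stream",   // placeholder camera URL
  "-f", "mjpeg", "-q:v", "5", "pipe:1",
]);

let buffer = Buffer.alloc(0);
const SOI = Buffer.from([0xff, 0xd8]); // JPEG start-of-image marker
const EOI = Buffer.from([0xff, 0xd9]); // JPEG end-of-image marker

ffmpeg.stdout.on("data", (chunk: Buffer) => {
  buffer = Buffer.concat([buffer, chunk]);
  // Cut complete JPEG frames out of the byte stream and broadcast each one.
  let start = buffer.indexOf(SOI);
  let end = buffer.indexOf(EOI, start);
  while (start !== -1 && end !== -1) {
    const frame = buffer.subarray(start, end + 2).toString("base64");
    buffer = buffer.subarray(end + 2);
    for (const client of wss.clients) {
      if (client.readyState === WebSocket.OPEN) client.send(frame);
    }
    start = buffer.indexOf(SOI);
    end = buffer.indexOf(EOI, start);
  }
});

On the page, the websocket's message handler can simply set img.src = "data:image/jpeg;base64," + message to display each frame. Note this trades the camera's multicast for a single unicast pull by the server, which is usually the practical compromise, since browsers cannot join multicast groups directly.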
I am trying to use www.rtcmulticonnection.org, specifically: http://www.rtcmulticonnection.org/docs/onstream/
I want to be able to grab the video element and connect it to a frequency analyzer.
I have tried $("video").get(0).connect, but that is not even a function (I am testing in Chrome). I do have an AudioContext and a working WebRTC connection. Any thoughts?
Are you looking for something like this example, which queries a local stream's volume using the WebAudio API?
In Chrome this has only recently (M49 or M50) been fixed to work with remote streams received via an RTCPeerConnection.
Note that you will typically need the MediaStream object, not the video element.
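A minimal sketch of that idea, assuming a remote MediaStream already obtained from the peer connection's stream/track callback (the variable name remoteStream is an assumption); it feeds the stream into an AnalyserNode instead of going through the video element:

declare const remoteStream: MediaStream;   // assumed to come from the RTCPeerConnection callback

const audioCtx = new AudioContext();
const source = audioCtx.createMediaStreamSource(remoteStream);
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 256;
source.connect(analyser);                  // analysis only; no need to route to the speakers

const bins = new Uint8Array(analyser.frequencyBinCount);
function poll() {
  analyser.getByteFrequencyData(bins);     // current frequency magnitudes, 0-255 per bin
  // ...use `bins` to drive a volume meter or frequency visualizer...
  requestAnimationFrame(poll);
}
poll();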
I'm making a game that changes some of its objects depending on what music is playing. After each song has ended I want my audio context to load in a new source and analyze that. However, whenever I tried to do that I got the error that an audio object or buffer can't be called twice.
After some research I learned that ctx.createMediaElementSource(MyHTML5AudioEl) lets you create a source node that takes its audio from an HTML5 audio element. With this I was able to loop through different songs.
However, for my game I need to play/analyze a 30-second "remote url" that comes out of the Spotify API. I might be wrong, but ctx.createMediaElementSource(MyHTML5AudioEl) does not seem to allow you to analyze a source that is on a remote site.
Also, the game needs to work on mobile Chrome, which createMediaElementSource(MyHTML5AudioEl) does not seem to work on.
I might be on the completely wrong path here but my main question is:
How can I switch remote song URLs in the Web Audio API, in a way that is compatible with mobile Chrome?
Thanks!
First, as you found out, you can't set the buffer again on an AudioBufferSourceNode. The solution is to create a new one; these are intended to be light-weight objects that you can easily create and use.
Second, in Chrome 42 and newer, createMediaElementSource requires CORS access, so you have to make sure the remote URL sends the appropriate CORS headers and that you set the element's crossOrigin attribute, as sketched below.
Finally, mobile Chrome currently does not pass the data from an audio element through createMediaElementSource.
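A minimal sketch of the CORS point, assuming the remote host (e.g. the Spotify preview URL) actually sends Access-Control-Allow-Origin headers; the example URL is a placeholder. The crossOrigin attribute must be set before src, and createMediaElementSource may only be called once per element, but you can keep reusing the same chain by changing src:

const audioCtx = new AudioContext();

const audio = new Audio();
audio.crossOrigin = "anonymous";                  // must be set before assigning src
audio.src = "https://example.com/preview.mp3";    // placeholder remote URL

const source = audioCtx.createMediaElementSource(audio);
const analyser = audioCtx.createAnalyser();
source.connect(analyser);
analyser.connect(audioCtx.destination);           // keep the element audible

audio.play();
// To switch songs, assign a new URL to audio.src; the existing
// MediaElementSource/Analyser chain keeps working.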
Can we record video (capture/streaming) with ActionScript 3 and upload it to a server?
Does anybody have a reference link for it?
Your help will be appreciated.
Thank you
Yes, there is support for recording video, and you can certainly send this data to the server.
There are many tutorials available; you can find them with a simple Google search:
https://influxis.com/simple-as3-recorder/
https://code.google.com/p/flvrecorder/ --- has a sample
http://www.purplesquirrels.com.au/2012/12/record-and-play-back-video-with-air-for-ios-on-ipad/
and many, many others.
If you want a free server to send the video to, I suggest looking into Red5; it seems it hasn't been supported for a while now, but the latest releases run well.
Red5: http://www.red5.org
You can capture audio and video from a webcam using Flash/ActionScript.
The encoded audio and video data is streamed (over RTMP) from the Flash client running in a browser to a media server such as Red5, Wowza, or AMS, where it is saved in .flv, .mp4, or .f4v video files.
For the exact client AS3 code see this answer.
Every time I try to view a video file on my server I get this error on iOS, in both Safari and Chrome.
I am using a blob server behind an Apache server, so I am not sure what the problem is. When I use Apache only, I still get this error, but the video renders anyway.
However, when I serve the video through my blob server it does not work. Does anyone know why? The videos work fine on other devices, and in browsers they also work fine if accessed through Apache only.
The solution to this problem was just a workaround. The reason is that blob servers aren't streaming servers. iOS devices expect videos to arrive in small chunks, which a streaming server can provide; a blob server just hands the video over as one blob, which is not what the iOS device expects. Some browsers are smart enough to handle this, but others are not.
The way I solved this was to put the video files outside of the blob server, in a folder within the project, and then serve them through the Apache server instead of through the blob server we were using. I hope this helps.
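For reference, the "small chunks" behaviour comes from HTTP Range requests: Safari on iOS asks for byte ranges and expects 206 Partial Content responses, which a plain blob handler typically doesn't produce. A minimal sketch of such a handler, assuming an Express server and a local video.mp4 (both of which are assumptions, not part of the original setup):

// range-server.ts (sketch): answer byte-range requests for a video with 206 responses
import express from "express";
import fs from "fs";

const app = express();

app.get("/video.mp4", (req, res) => {
  const path = "video.mp4";                        // placeholder file
  const size = fs.statSync(path).size;
  const range = req.headers.range;                 // e.g. "bytes=0-1" from Safari

  if (!range) {
    // No Range header: send the whole file.
    res.writeHead(200, { "Content-Type": "video/mp4", "Content-Length": size });
    fs.createReadStream(path).pipe(res);
    return;
  }

  const [startStr, endStr] = range.replace("bytes=", "").split("-");
  const start = parseInt(startStr, 10);
  const end = endStr ? parseInt(endStr, 10) : size - 1;

  res.writeHead(206, {
    "Content-Range": `bytes ${start}-${end}/${size}`,
    "Accept-Ranges": "bytes",
    "Content-Length": end - start + 1,
    "Content-Type": "video/mp4",
  });
  fs.createReadStream(path, { start, end }).pipe(res);
});

app.listen(8080);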
I was also getting this error for some mp4 videos. It turned out that for me it wasn't a server issue but a video encoding issue.
Issue
A "moov atom" needs to be placed at the front of the video file. It serves as a table-of-contents for the video. That "moov atom" has to be read first for html streaming or it won't play on some devices.
The Fix
To fix it, I used HandBrake to transcode my video: turn on 'Web optimized'. Turning on 'zerolatency' and 'fast decode' (found in the video tab) may also help.
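If you prefer the command line, the same relocation of the moov atom can also be done without re-encoding, using ffmpeg's faststart flag (input.mp4 and output.mp4 are placeholders):
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4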
We were getting a similar error here. I thought it might have been a streaming issue, since our video was hosted in blob storage on Azure, but after setting up a Media Service for streaming the video still didn't work. It turned out the cause of the bug for us was Safari using a Service Worker. Below is some further explanation of what we found:
Safari first sends a byte-range request for a video tag and expects a 206 response. However, if you use a Service Worker, the response comes back with a 200, and it appears Safari doesn't know how to handle this. Our solution was to exclude the Service Worker for Safari.
We found this by using the network tab of the Safari debugger on a Macbook to troubleshoot the issue we were seeing on the iPad. Attached is a screenshot for comparison/reference. The left tab shows what the call should look like by default. The right tab shows what you would see if using a Service Worker.
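A minimal sketch of one way to deal with this without dropping the Service Worker entirely: let requests carrying a Range header bypass the fetch handler so the browser performs them natively and gets its 206 back (the cache-first strategy shown here is an assumption, not the original setup):

// service worker fetch handler (sketch)
self.addEventListener("fetch", (event: any) => {
  const request: Request = event.request;

  // Safari's video requests carry a Range header and expect a 206 response;
  // returning without calling respondWith() lets the network handle them natively.
  if (request.headers.get("range")) {
    return;
  }

  // Everything else: serve from cache if available, otherwise go to the network.
  event.respondWith(
    caches.match(request).then((cached) => cached || fetch(request))
  );
});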
Add the following line of code to your .htaccess (located in the root of your WordPress installation):
SetEnvIfNoCase Request_URI \.(?:mp4)$ no-gzip dont-vary
Reference: https://clickshepherd.com/blog/solved-elegant-themes-divi-and-cloudflare-mp4-media-error-formats-not-supported-or-sources-not-found/
In our case, we created a URL pattern for our blob assets and then set headers in that URL pattern definition page which sent back a mime type of 'video/mp4'. This should instruct the browser to treat the binary stream as chunked, which in turn meant we didn't need to download the whole thing before it started playing.
Google Cloud Platform Solution
This issue caused me a lot of headache, so I just wanted to add my specific solution here, if anyone else encounters this while deploying to Google Cloud Platform.
When trying to load MP4 videos in Safari, I was getting the same error:
"Failed to Load Resource, Plugin Handled Load"
which was preventing the videos from playing.
Still, I wanted to try to keep everything inside Google Cloud, so I created a Storage Bucket for the site, and added the videos there.
Of course, trying to retrieve the videos from the storage URL from the main site resulted in a CORS error.
Fortunately, you can configure CORS pretty easily on storage buckets:
Configuring cross-origin resource sharing (CORS)
Once that configuration was deployed, I was able to retrieve and load the videos on the site in Safari without the "plugin handled load" error.
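For reference, a minimal sketch of such a bucket CORS configuration, applied with the gsutil tool; the origin and bucket name are placeholders:
cors-config.json:
[
  {
    "origin": ["https://www.example.com"],
    "method": ["GET", "HEAD"],
    "responseHeader": ["Content-Type"],
    "maxAgeSeconds": 3600
  }
]
Applied with:
gsutil cors set cors-config.json gs://my-video-bucket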
I saw the error "Failed to Load Resource" and thought that this was the reason my videos were not playing.
It turned out my videos were missing the hvc1 tag, and once I added it they played fine.
In my case the issue was with H.265 (HEVC) videos, but in your case some other encoding or tagging issue could be the reason.
In my case, the issue was fixed with ffmpeg:
ffmpeg -i input.mp4 -tag:v hvc1 -acodec copy -c:v copy -movflags faststart out.mp4