I'm creating a web app that is going to include a video.
Users are going to open my app in a new Chrome window, and screen share it via Zoom.
The video will never be in full screen mode, but it will fill up the whole window.
How can I optimize the video/window resolution for screen sharing so that the video quality will be the highest possible for viewers on Zoom?
Is it possible to make it optimal while keeping a small window size, or does the window have to be large for the screen sharing to be high quality?
Thank you
When sharing a window, only the rasterized pixel output of that window is captured.
Therefore, if you need a high resolution version of your web page, the window that is opened must also be sized appropriately.
Additionally, video is particularly tricky: capturing video from a window requires a great deal of CPU and isn't normally done at full frame rate. The codecs used for screen sharing are optimized for low frame rates at higher resolution and quality. Because of these limitations, video shared from a web page is basically never going to stream well over Zoom or any similar software.
You'll have to either make do with the quality that exists now, or skip the screen-sharing route entirely.
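To illustrate the sizing point above, here is a minimal sketch of computing the CSS-pixel window size needed to produce a given capture resolution on a high-DPI display. The function name is made up, and resizing via `window.resizeTo` is an assumption (it only works on windows your script opened itself), not anything Zoom requires:

```javascript
// Sketch: a shared window is captured at its rasterized pixel size, which is
// the CSS size multiplied by devicePixelRatio. To get a capture of roughly
// targetWidth x targetHeight pixels, request this CSS-pixel window size.
function cssSizeForCapture(targetWidth, targetHeight, devicePixelRatio) {
  return {
    width: Math.ceil(targetWidth / devicePixelRatio),
    height: Math.ceil(targetHeight / devicePixelRatio),
  };
}

// Browser-only usage (hypothetical):
// const { width, height } = cssSizeForCapture(1920, 1080, window.devicePixelRatio);
// window.resizeTo(width, height); // only allowed on windows opened via window.open
```

On a 2x display this suggests a viewer can get a 1920x1080 capture from a window that is only 960x540 CSS pixels, which is one answer to the "small window" part of the question.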
Related
Data passed to my physical monitor is separate from the data passed to screen captures. I am sure of this because, if I run a screen capture while playing a movie in Chrome (but not Firefox!), most popular services will just show a black screen. This implies that the visual data passed from a browser to my desktop is separate from the data going from my desktop to a screen-capture application. But how? What are they doing to keep these separate? How does my OS know which data from the browser is fine to show to the screen-capture program and which is not?
Another example of this phenomenon is when certain less-than-ethical streamers use video-game cheats that show hidden player locations on their own monitor, but not to their audience on a live stream.
When you play video using DRM and encryption, the decrypted video isn't capturable via the usual methods. In fact, if done correctly, the video stays encrypted all the way to the monitor via HDCP. The reality is that the whole stack of components used for DRM is unreliable, so it's more common that you'll just get lower-quality video if your system cannot guarantee encryption.
Some resources:
DRM and HDCP
Encrypted Media Extensions API
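As a hedged sketch of how the Encrypted Media Extensions API exposes this, here is how a page might probe for a hardware-protected pipeline. `buildKeySystemConfig` is a made-up helper; the key system string is Widevine's, and the robustness level shown is one of several the browser may or may not grant:

```javascript
// Sketch: build a key-system configuration asking for a given robustness
// level. 'HW_SECURE_ALL' (Widevine) means decryption, decoding, and output
// protection all happen in hardware -- the case where capture shows black.
function buildKeySystemConfig(robustness) {
  return [{
    initDataTypes: ['cenc'],
    videoCapabilities: [{
      contentType: 'video/mp4; codecs="avc1.42E01E"',
      robustness, // e.g. 'HW_SECURE_ALL', or '' for the default
    }],
  }];
}

// Browser-only usage (hypothetical):
// navigator.requestMediaKeySystemAccess('com.widevine.alpha',
//     buildKeySystemConfig('HW_SECURE_ALL'))
//   .then(() => console.log('hardware-secure pipeline available'))
//   .catch(() => console.log('falling back to lower robustness / quality'));
```

When only a software robustness level is granted, services typically serve lower-resolution streams, which matches the "lower quality video" behaviour described above.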
I want to add a GIF of an app that I'm working on to the app's website. Previously, I used QuickTime to screen-record my computer and then used EZGIF to convert the video into a GIF. Unfortunately, in order to get the GIF down to a reasonable size to embed on my website (~5 MB), the quality goes completely down the drain (you can see the bad-quality GIF on the website now).
To show off enough of the app's functionality to make it worth putting on the homepage, the original video that I'm recording is ~45 seconds.
Are there other, better methods for recording and compressing GIFs for websites?
I have video that will be divided into 4 videos.
First the player will stream a lower-resolution version of the original video; then the user can zoom into the video to see more details. I need the player to stream whichever of the 4 higher-resolution videos corresponds to where the user zoomed in.
How can I do that using VideoJS or any other video player?
After searching, here is the answer:
For zooming into the video, you can follow this tutorial: Zooming and rotating for video in HTML5 and CSS3
To switch streams within the same player, you can change the source on the HTML5 video tag: do some calculations to work out where the user zoomed in, and set the source to the corresponding video.
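The source-switching idea above could be sketched like this; the quadrant file names are placeholders, and the coordinate-to-quadrant mapping is just one reasonable assumption for "4 videos":

```javascript
// Sketch: map the point the user zoomed on to one of four higher-resolution
// streams, numbered top-left, top-right, bottom-left, bottom-right.
const QUADRANT_SOURCES = [
  'video-top-left.mp4', 'video-top-right.mp4',     // placeholders
  'video-bottom-left.mp4', 'video-bottom-right.mp4',
];

function quadrantIndex(x, y, width, height) {
  const col = x < width / 2 ? 0 : 1;  // left or right half
  const row = y < height / 2 ? 0 : 1; // top or bottom half
  return row * 2 + col;
}

// Browser-only usage: swap the <video> source and resume playback.
// const video = document.querySelector('video');
// const i = quadrantIndex(clickX, clickY, video.clientWidth, video.clientHeight);
// video.src = QUADRANT_SOURCES[i];
// video.play();
```

Note that setting `src` restarts playback from zero, so in practice you would also save and restore `video.currentTime` around the swap.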
As there is no response yet, let me analyse the problem. This is by no means a full answer, but other people will probably be able to answer parts of it:
First the player will stream a lower-resolution version of the original video,
This means you will need to create or use a video stream. There are plenty of plugins you can use for video streaming, depending on what you want. You could also consider writing it yourself, for example using C#'s System.IO classes to transform the video into bytes (and put it back together). The lower resolution is most easily achieved by simply having a separate video file for this step of the process (a lower-resolution one used for streaming only).
then the user can zoom into the video to see more details; I need the player to stream whichever of the 4 higher-resolution videos corresponds to where the user zoomed in.
So you need to trigger a zoom effect, which means you need to detect zoom. This is possible with JavaScript in a web browser, if you want a browser-based application. When the zoom is triggered, you can retrieve the position of the mouse on the screen, in the div, or on some sort of overlay. Depending on this position, you can show another stream.
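The zoom-detection step could be sketched like this; the step size, the clamp range, the switch threshold, and the `switchToHighResStream` helper are all assumptions for illustration:

```javascript
// Sketch: track a zoom level driven by the wheel event, clamped to a range.
// Wheel up (negative deltaY) zooms in; wheel down zooms out.
function nextZoom(current, deltaY, min = 1, max = 4) {
  const step = deltaY < 0 ? 0.25 : -0.25;
  return Math.min(max, Math.max(min, current + step));
}

// Browser-only usage (hypothetical helper names):
// let zoom = 1;
// videoContainer.addEventListener('wheel', (e) => {
//   e.preventDefault();
//   zoom = nextZoom(zoom, e.deltaY);
//   if (zoom > 1.5) {
//     // e.offsetX / e.offsetY give the mouse position inside the container,
//     // which tells you which higher-resolution stream to switch to.
//     switchToHighResStream(e.offsetX, e.offsetY);
//   }
// });
```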
How can I do that using VideoJS or any other video player?
Basically, the steps above are how I would start looking into this specific case. Given your VideoJS suggestion, I assume this is browser-based, which probably means using JavaScript libraries, maybe combined with a server-side language.
That's as far as I can go. Maybe someone can pick up specific parts of what I wrote and help you a step further.
Have a nice day!
I need to be able to continue rendering a SWF file while it is off screen or minimized. Taken from the Adobe website: "This is an automatic feature in Flash Player since version 10.1. Flash Player minimizes processing when SWF content goes off-screen."
I have searched extensively for a solution to this. One suggestion was to use the HTML parameter "hasPriority" and set it to true, which ensures some things are not paused; however, SWF content stops rendering when off screen or hidden regardless of this setting.
Does anyone know if it is possible to disable this automatic feature so my SWF will continue to render off screen?
Thanks in advance for any help.
The Flash virtual machine is specifically designed so that, while viewing Flash in the browser, the VM is paused when the instance of the player loses window focus. This is necessary functionality in order to... well... keep Flash from utterly destroying your computer, forcing it to catch fire, and sending it to the computer underworld. Just imagine what would happen if you had 3-4 Flash sites open and rendering off screen on your tablet: it would die a horrible death. You cannot override this functionality.
I'm trying to create a video tag for use with Chrome only. I don't always know the dimensions of the video, but I would like it to be the size of the window. I thought I could accomplish this with width="100%" and height="100%", but I found that the built-in controls were hard to see, so I reduced the height to 98%. Most of the videos I'm currently trying to play are 720p MP4s. I tried playing a 1080p video (actual dimensions 1920x1040), and it wouldn't work: the player acted as if my source was wrong, but I could right-click and successfully download the file. The file was more than 3 GB; I'm not sure if that had anything to do with it.
Edit: I also checked that the codecs were exactly the same between the smaller and larger videos, and they are.
Anybody else having the same or similar issues?
I just wanted to post here now that I've fixed this issue. The problem was that the metadata players need to start playback (the "moov" atom, which indexes the file) is written at the end of a normally encoded MP4. For streaming, this information needs to be at the front of the file so the player can decide how to buffer and so on. Chrome must do some sort of quick check and give up when it doesn't find the index at the beginning. I used a program called QT-Faststart, which moves the moov atom to the front of the file, and that solved the issue.
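For anyone wanting to check whether a given MP4 already has its moov atom up front, here is a minimal Node.js sketch that walks the file's top-level boxes (each box starts with a 4-byte big-endian size followed by a 4-byte type code). It deliberately ignores 64-bit extended box sizes, so treat it as a rough diagnostic, not a validator:

```javascript
// Sketch: returns true if the 'moov' box appears before 'mdat' in the buffer,
// i.e. the file is already "fast start" and suitable for progressive streaming.
function isFastStart(buf) {
  let offset = 0;
  while (offset + 8 <= buf.length) {
    const size = buf.readUInt32BE(offset);               // box size incl. header
    const type = buf.toString('ascii', offset + 4, offset + 8); // box type code
    if (type === 'moov') return true;  // index found before media data
    if (type === 'mdat') return false; // media data comes first: not fast start
    if (size < 8) break; // size 0 or 1 (extended size) not handled in this sketch
    offset += size;
  }
  return false;
}

// Usage: isFastStart(require('node:fs').readFileSync('movie.mp4'))
```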