I've been trying to create a multi-viewport WebGL application.
I got everything rendering quite nicely using viewport + scissor for each view.
Now I would like to improve rendering by redrawing only the view that has been updated, and skip overdrawing the rest.
I've made a little demo showing the idea: http://kile.stravaganza.org/lab/js/scissor/
As I understand scissoring, it should only render inside the current scissor box and leave the rest of the canvas untouched. But it seems the whole canvas keeps getting cleared on every frame, no matter what I tried :(
This is the rendering code (the last view is supposed to be rendered just once and kept on every frame):
function drawScene()
{
  gl.clearColor(1.0, 0.0, 0.0, 0.0);
  gl.scissor(0, 0, 200, 200);
  gl.viewport(0, 0, 200, 200);
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
  drawFigures();

  gl.clearColor(0.0, 1.0, 0.0, 0.0);
  gl.scissor(200, 0, 200, 200);
  gl.viewport(200, 0, 200, 200);
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
  drawFigures();

  gl.clearColor(0.0, 0.0, 1.0, 0.0);
  gl.scissor(200, 200, 200, 200);
  gl.viewport(200, 200, 200, 200);
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
  drawFigures();

  // Render just once
  if (first)
  {
    gl.clearColor(1.0, 1.0, 0.0, 0.0);
    gl.scissor(0, 200, 200, 200);
    gl.viewport(0, 200, 200, 200);
    gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
    drawFigures();
    first = false;
  }
}
Any idea how I could achieve this effect?
Thank you very much in advance
You can use the preserveDrawingBuffer attribute:
gl = canvas.getContext("experimental-webgl", { preserveDrawingBuffer: true });
It isn't recommended to use this in production. The WebGL specification states:
While it is sometimes desirable to preserve the drawing buffer, it can
cause significant performance loss on some platforms. Whenever
possible this flag should remain false and other techniques used.
Techniques like synchronous drawing buffer access (e.g., calling
readPixels or toDataURL in the same function that renders to the
drawing buffer) can be used to get the contents of the drawing buffer.
If the author needs to render to the same drawing buffer over a series
of calls, a Framebuffer Object can be used.
This SO question also contains relevant information regarding preserveDrawingBuffer: When WebGL decide to update the display?
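For reference, here is a minimal sketch of what the drawScene from the question could look like with this attribute, assuming the context was created with preserveDrawingBuffer: true as shown above; drawFigures() and the first flag are from the question, and drawView is a hypothetical helper:

function drawView(x, y, r, g, b) {
  gl.enable(gl.SCISSOR_TEST);   // clears and draws are clipped to the scissor box
  gl.scissor(x, y, 200, 200);
  gl.viewport(x, y, 200, 200);
  gl.clearColor(r, g, b, 0.0);
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
  drawFigures();
}

function drawScene() {
  // views that are updated every frame
  drawView(0, 0, 1.0, 0.0, 0.0);
  drawView(200, 0, 0.0, 1.0, 0.0);
  drawView(200, 200, 0.0, 0.0, 1.0);

  // static view: rendered once and kept, since the drawing buffer is preserved
  if (first) {
    drawView(0, 200, 1.0, 1.0, 0.0);
    first = false;
  }
}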
Related
I'm writing an application with WebGL. In my fragment shader I set a constant color for the fragment, but in my canvas I actually get a slightly different color. For example, I write this code in my fragment shader:
precision mediump float;
void main(void) {
  gl_FragColor = vec4(0.0, 0.0, 1.0, 1.0);
}
In my canvas I see the color #0005FA, which is not what I want.
On the other hand, when I set the output color to, for example, vec4(1.0), I get the correct color #FFFFFF in my canvas.
I get this problem when using Google Chrome. In Firefox the colors are just fine. I use Debian; maybe that is related to this problem.
The difference in colors is hardly noticeable, but it ruins my debugging. I cannot properly read values. Does anybody know how to solve this?
I'm using the Forge Viewer to display some models converted from IFC (2x3) files.
For some of them, the quality is perfect, but for others the rendering is very poor, like the picture below.
I've tried exporting to both SVF and SVF2, with the same result.
I've tried different settings to load the model:
let config = {
  keepCurrentModels: true,
  applyScaling: { to: "m" },
  applyRefPoint: true,
  globalOffset: { x: 0, y: 0, z: 0 } // makes the view flicker on the badly rendered model
};
None of those settings improved the view, except globalOffset, which makes the view flicker.
Do you have any idea how to fix this?
This kind of deformation of the geometry is typically an indication that the model is very far from the origin, so far that the GPU rendering starts running into floating-point precision issues.
Loading the model with globalOffset: new THREE.Vector3(0, 0, 0) should help in this case as it would basically force the viewer not to re-apply the original global offset (which is potentially very large) to all geometry vertices. I'm not sure why the view would flicker after using this option, though, that might be a separate issue.
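As a minimal sketch (assuming the standard Viewer v7 loadDocumentNode API and an already initialized viewer and doc; adjust to your own setup), the load call could look like this:

const viewable = doc.getRoot().getDefaultGeometry();
viewer.loadDocumentNode(doc, viewable, {
  keepCurrentModels: true,
  applyRefPoint: true,                      // keep the model's shared coordinate system
  globalOffset: new THREE.Vector3(0, 0, 0)  // don't re-apply the (potentially huge) original offset
});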
I have a WebGL canvas that is being continuously updated (a simulation).
Now I want to freeze the current content of the canvas. I keep getting updates for the simulation, which I need to keep feeding to the visualizer. So my idea is to clone the exact state of the current WebGL canvas onto a new one and hide the current one, which continues to get updated. Then I can remove the frozen one and the live simulation is shown again.
I haven't been able to achieve this, and examples I've found on the web like this one: Any way to clone HTML5 canvas element with its content?
only apply to 2D canvases.
Google search didn't help much either.
This one:
how to copy another canvas data on the canvas with getContex('webgl')?
seemed promising, but I haven't been able to figure out how to apply it.
Cloning the canvas seems to me to be a heavy and weird solution.
The simplest way to achieve what you want is to prevent the frame buffer from being presented (swapped, then cleared) to the HTML canvas. To do so, you simply have to avoid calling gl.clear, gl.drawArrays or gl.drawElements during your loop.
For example, suppose you have two functions, one running your simulation and the other doing your GL drawing:
function simulate() {
  // update simulation here
}

function draw() {
  gl.clearColor(0.0, 0.0, 0.0, 1.0);
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT);
  // do drawing stuff here
  gl.drawArrays(gl.TRIANGLES, 0, 12345);
  // etc...
}
From this point, if you want to "freeze" the canvas content, you simply have to stop calling the "draw" function within your global loop. For example:
function loop() {
  simulate();
  if (!freeze) draw();
  requestAnimationFrame(loop);
}
You may use other methods to achieve the same effect. For example, you can draw your scene to a texture, then draw that texture to the canvas. This way you can also control when the texture is cleared and redrawn, while it is still being rendered to the canvas.
However, to implement the render-to-texture method, you will have to make heavier modifications to your code: you'll need an additional shader to draw the texture on screen, and you'll have to spend some time working with framebuffer and renderbuffer objects.
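A rough sketch of that approach, assuming a drawTexturedQuad(tex) helper (not shown) that draws a full-screen quad with its own trivial shader:

// create a texture-backed framebuffer the size of the canvas
var fb = gl.createFramebuffer();
var tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, canvas.width, canvas.height, 0,
              gl.RGBA, gl.UNSIGNED_BYTE, null);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);
gl.bindFramebuffer(gl.FRAMEBUFFER, null);

function loop() {
  simulate();
  if (!freeze) {
    // update the texture only while the view is not frozen
    gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
    draw();
    gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  }
  // the canvas always shows whatever was last rendered into the texture
  drawTexturedQuad(tex);
  requestAnimationFrame(loop);
}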
I'm drawing a simple square in Stage3D, but the quality of the numbers and the edges in the picture is not as high as it should be:
Here's the example with the (small amount of) source code; I've put most of it in one file.
http://users.telenet.be/fusion/SquareQuality/
http://users.telenet.be/fusion/SquareQuality/srcview/
I'm using mipmapping; in my shader I use "<2d, miplinear, repeat>". The texture is a 256x256 JPG (bigger than in the image); I also tried a PNG, tried "mipnearest", and tried without mipmapping. Anti-aliasing is set to 4, but 10 or more doesn't help at all...
Any ideas?
Greetings,
Thomas
Are you using antialiasing for the back buffer?
// Listen for when the Context3D is created for it
stage3D.addEventListener(Event.CONTEXT3D_CREATE, onContext3DCreated);

function onContext3DCreated(ev:Event): void
{
  var context3D:Context3D = stage3D.context3D;
  // Setup the back buffer for the context
  context3D.configureBackBuffer(stage.stageWidth, stage.stageHeight,
                                0,    // no antialiasing (values 2-16 for antialiasing)
                                true);
}
I think that the resolution of your source texture is too high. The GPU renders your scene pixel by pixel in the fragment shader. When it renders a pixel of your texture, the fragment shader gets a varying that represents the texture UV, and the GPU simply takes the color of the texel at that UV coordinate of your texture.
Now, when your texture resolution is too high, you lose information, because two neighboring pixels on the screen correspond to non-neighboring pixels in the texture resource. For example, if you draw a texture 10 times smaller than the resource, you get something like this (where each character corresponds to a pixel, in one dimension):
Texture: 0123456789ABCDEFGHIJKLM
Screen: 0AK
I'VE FOUND IT!!!
I went to the Starling forum and found an answer from Daniel from Starling:
"If you're using TRILINEAR, you're already using the best quality available. One additional thing you could try is to set the "antialiasing" value of Starling to a high value, e.g. 16, and see if that helps."
So I came across an article that said trilinear filtering is only used when you put the "linear" argument in your fragment shader; in my example program:
"tex ft0, v0, fs0 <2d, linear, miplinear, repeat>".
Greetings,
Thomas
I want to animate the endpoint of a Bézier curve to given x,y coordinates in an HTML5 canvas without redrawing the entire stroke. Basically, I need to make the endpoint look as though it is draggable, and when dragged, it affects the length of the line.
This is my current standard bezier stroke code:
var canvas = document.getElementById("myCanvas"),
    context = canvas.getContext("2d"),
    controlX1 = 140,
    controlY1 = 10,
    controlX2 = 388,
    controlY2 = 10,
    endX = 388,
    endY = 170;

context.moveTo(188, 130);
context.bezierCurveTo(controlX1, controlY1, controlX2,
                      controlY2, endX, endY);
context.lineWidth = 10;
context.strokeStyle = "black";
context.stroke();
Does anyone have any ideas how this can be accomplished without using a library like Raphael? I am using jQuery, so that is an available resource.
without redrawing the entire stroke.
That's not possible. The way you animate things in HTML5 Canvas is by (clearing and) redrawing them.
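A minimal sketch of that pattern, animating the endpoint toward hypothetical target coordinates (targetX, targetY) by clearing and redrawing the curve every frame, reusing the variables from the question:

var targetX = 300, targetY = 60;  // hypothetical destination for the endpoint

function render() {
  context.clearRect(0, 0, canvas.width, canvas.height);  // erase the previous stroke
  context.beginPath();
  context.moveTo(188, 130);
  context.bezierCurveTo(controlX1, controlY1, controlX2, controlY2, endX, endY);
  context.lineWidth = 10;
  context.strokeStyle = "black";
  context.stroke();
}

function animate() {
  // move the endpoint a small step toward the target each frame
  endX += (targetX - endX) * 0.1;
  endY += (targetY - endY) * 0.1;
  render();
  if (Math.abs(targetX - endX) > 0.5 || Math.abs(targetY - endY) > 0.5) {
    requestAnimationFrame(animate);
  }
}

animate();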
library like Raphael
For the record, Raphael uses SVG, not HTML5 Canvas, and SVG makes this sort of thing much easier because it is a retained drawing surface.
Canvas is an immediate drawing surface. As soon as you draw something (like a curve), the canvas has no knowledge of what was drawn or where it is; you have to keep track of everything yourself. I feel like I parrot this a lot, but I wrote a simple tutorial on learning to retain the necessary information to make canvas feel persistent like SVG, which can be found here.
That being said, you might be better off using SVG (and not Canvas) if your planned app/site is not going to be very complex or intensive.