OBJ to JS causes WebGL error - json

I downloaded a 3D mesh from Archive3D, converted it to .obj in 3ds Max using these settings, and finally converted the .obj to .js using the Three.js editor.
Then I create the scene and add the model, as shown here.
These are the errors that I get in the console:
[.WebGLRenderingContext]GL ERROR :GL_INVALID_OPERATION : glDrawElements: attempt to access out of range vertices in attribute 2 fly.html:1
WebGL: too many errors, no more errors will be reported to the console for this context.
What is the problem, is maybe the problem in the .js and how can I fix it?

It most likely means there's an error in the data. Many converters don't validate that the data is actually correct.
The error means that one or more of the indices in your data is out of range for the data given. In other words, let's say you had 3 vertices. That means you can only have indices in the range 0 to 2. If you had an index greater than 2, you'd get that error.
Whether the error is in the original data, in the converter to .obj, or in the converter from .obj to .js, we can't know without debugging each of those steps.
You could write code to walk through the data when you load it and check that none of the indices are out of range. What to do if they are is up to you. You could try to remove them, in groups of 3, assuming you're drawing triangles. In other words, figure out what the smallest buffer in the data is (positions, normals, texcoords), then walk the indices 3 at a time. If any of the 3 indices is out of range, delete those 3 indices (see the sketch below).
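As a rough illustration of that check, here is a minimal sketch; the array names (positions, normals, uvs, indices) are hypothetical stand-ins for whatever names your loaded data actually uses, and it assumes indexed triangles:

// Hypothetical flat arrays taken from the loaded model data:
// positions/normals have 3 components per vertex, uvs has 2, and
// indices holds 3 entries per triangle.
function removeOutOfRangeTriangles(positions, normals, uvs, indices) {
  // The highest index we may legally reference is limited by the
  // smallest of the per-vertex buffers.
  var vertexCount = Math.min(
    Math.floor(positions.length / 3),
    Math.floor(normals.length / 3),
    Math.floor(uvs.length / 2)
  );

  var cleaned = [];
  // Walk the indices one triangle (3 indices) at a time.
  for (var i = 0; i + 2 < indices.length; i += 3) {
    var a = indices[i], b = indices[i + 1], c = indices[i + 2];
    if (a < vertexCount && b < vertexCount && c < vertexCount) {
      cleaned.push(a, b, c);   // keep this triangle
    } else {
      console.warn('Dropping triangle with out-of-range index:', a, b, c);
    }
  }
  return cleaned;   // use this in place of the original index array
}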
Did the file display when you loaded it into the Three.js editor? If that's the case, then either there is a bug in the Three.js editor's export, or the data somehow got corrupted some other way.
You also mentioned code from this page. That page has nothing whatsoever to do with three.js. The data format that code expects is unrelated to three.js, AFAIK. Data exported with the Three.js editor is only for use in Three.js.

Related

How can I use the CSV file loading functionality of Dygraphs myself (to load CSV data and then add a new series to it myself before chart rendering)?

Since Dygraphs apparently does not have any functionality for adding separate series of data to a chart one at a time (but rather only loading all the data series of a chart at once, from a CSV file or an in-memory array of arrays), I'm looking to write some code to do this myself.
My reason? My problem/scenario is that I have a "base file" containing a data series many millions of values large. I then need to show many separate charts that display this large data series TOGETHER with a bunch of other, respective smaller data series. I'd very much rather not duplicate the large data series in a new CSV file on disk for each such chart, but rather first load the big "base data series" from the CSV "base file" directly from my JavaScript, and then, for each such chart, integrate one of the smaller data series with it before sending it off for rendering by means of a new Dygraph(...) call.
The CSV file loading functionality that obviously already exists somewhere inside the Dygraphs code is very nice, so if possible I'd very much like to use it to load the large "base data series" from a single separate CSV file.
So, in short, the question is:
How can I use the existing CSV file loading functionality of Dygraphs separately from inside my own code, in order to load arbitrary CSV files into the Dygraphs chart data array format in-memory, so that I can finally merge these multiple data series arrays using my own custom code?
What I'm hoping for is something like this:
loaded_data_series_1 = some_secret_internal_function_or_method_of_dygraphs('file1.csv');
loaded_data_series_2 = some_secret_internal_function_or_method_of_dygraphs('file2.csv');
merged_data_series = my_own_custom_dataseries_merging_code(loaded_data_series_1, loaded_data_series_2);
g = new Dygraph(document.getElementById('my_chart'), merged_data_series,{});
The key here would thus be to know what some_secret_internal_function_or_method_of_dygraphs() should be replaced with for this to work.
Could the Dygraph devs or anyone else possibly point me in the right direction here?
(I tried to look inside the Dygraphs code myself, but unfortunately got lost pretty quickly due to insufficient Javascript coding skills on my side)
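To make the intended merge step concrete, here is a rough sketch of what my_own_custom_dataseries_merging_code might look like, assuming both loaded series are already in Dygraphs' documented native format (an array of rows with the x value in column 0), that the x values are plain numbers, and that null marks a missing point:

// Hedged sketch: merge two Dygraphs-style native arrays ([x, y] rows,
// x in column 0) into one array of [x, y1, y2] rows, matched on x.
// Assumes numeric x values; Dates would need a different merge key.
function my_own_custom_dataseries_merging_code(series1, series2) {
  var byX = {};
  series1.forEach(function (row) {
    byX[row[0]] = [row[0], row[1], null];   // y2 unknown so far
  });
  series2.forEach(function (row) {
    if (byX[row[0]]) {
      byX[row[0]][2] = row[1];              // same x exists in series1
    } else {
      byX[row[0]] = [row[0], null, row[1]]; // x only in series2
    }
  });
  // Dygraphs expects the rows sorted by x.
  return Object.keys(byX)
    .map(Number)
    .sort(function (a, b) { return a - b; })
    .map(function (x) { return byX[x]; });
}

With three columns in the merged array you would also pass a matching labels option to the Dygraph constructor, e.g. labels: ['x', 'series 1', 'series 2'].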

D3, DC, CSV - How to stop scientific notation rounding numbers in js

I'm loading multiple .csv files, which I then merge on the basis of a common ID. In some cases, the ID is an integer > 12000000000000.
Initially I was loading CSV files, but switched to DSV in the hope of fixing the problem. I'm using queue.js, so my loading code looks like this:
queue()
.defer(d3.dsv("|", "text/plain"), 'portfolio2.csv')
.defer(d3.dsv("|", "text/plain"), 'ratings2.csv')
Although the files have an extension of .csv, they are pipe delimited and load fine.
I think the problem is that D3, or JavaScript, or some part of my code loads the value in scientific notation, and somehow it overwrites the original data with a rounded number. So where I have an ID of 12000000110858, once I load the webpage, the number changes in the CSV file to 12000000000000. There is no line of code overwriting this ID field; it's as though the act of loading the file is enough by itself.
I've tried using pipe-delimited files and they suffer from the same problem.
Of course, my app then falls over, because it can't perform the matching. Is anyone familiar with this problem and knows of a solution?
Thanks for any help.
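For reference, two things one can check here, sketched below with illustrative assumptions (the ID field name and the merge step are placeholders, not code from the question): a JavaScript number holds 12000000110858 exactly, and d3's dsv parser returns fields as strings unless they are explicitly coerced, so matching on the string avoids any numeric conversion.

// Sanity check: values up to 2^53 - 1 (9007199254740991) are exact in
// JavaScript, and this one also prints without scientific notation.
console.log(String(12000000110858));                 // "12000000110858"
console.log(12000000110858 <= 9007199254740991);     // true

// Hedged workaround sketch: keep the ID as the string that d3.dsv hands
// back and never coerce it with +d.ID. Field/array names are assumptions.
function mergeById(portfolioRows, ratingRows) {
  var ratingsById = {};
  ratingRows.forEach(function (r) { ratingsById[r.ID] = r; });
  return portfolioRows.map(function (p) {
    return { portfolio: p, rating: ratingsById[p.ID] || null };  // string lookup
  });
}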

export plots with netlogo

I am trying to export all the plots of my NetLogo model after simulation runs, in CSV format, with the primitive export-all-plots.
I haven't yet found a way to open this CSV file with an external reader in order to get clearer plots. I tried gnuplot, but it looks like it's not able to open the CSV format created by NetLogo:
"export-plots data (NetLogo 5.0.5)"
^
"C:\results\interface.csv", line 1: invalid command
How can I open csv plots with an external reader?
There are a few complicating factors about NetLogo's plot export format. First, there's a three-line header at the beginning (plus an empty line after it) that just gives information about the model and when the data was generated. Next, there's data about the model settings and the plot state (pen colors and such). Finally, there's the data itself, which is itself somewhat complicated by the fact that you can have multiple pens per plot. So I'm not surprised gnuplot couldn't read it as is.
The tables are quite easy to use in a GUI spreadsheet application like Excel, LibreOffice Calc, or Gnumeric. You can just select the data you want and generate the plots.
To do this at the command line, I'm afraid you might have to write a script to read it in. This should be pretty easy in something like Python or R. Just skip the metadata lines, and use a CSV parser to read in the rest.
You might also try using BehaviorSpace to generate the data, but make sure to use the table output. It lets you generate the data from many runs at once, and the format is a little more consistent. There are still 6 lines of metadata at the top, but you can just delete those (a sketch of that step follows). I believe this is more the standard practice in NetLogo.
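As a small sketch of that "delete the metadata" step, here in Node.js (the file names are placeholders and the six-line count is taken from the paragraph above; adjust it for your NetLogo version):

// Strip the metadata header from a NetLogo BehaviorSpace "table" CSV so
// that gnuplot or a plain CSV parser can read it. File names and the
// six-line header count are assumptions; adjust them for your own export.
var fs = require('fs');

var METADATA_LINES = 6;
var raw = fs.readFileSync('experiment-table.csv', 'utf8');
var lines = raw.split(/\r?\n/);
var cleaned = lines.slice(METADATA_LINES).join('\n');

fs.writeFileSync('experiment-table-clean.csv', cleaned);
console.log('Wrote experiment-table-clean.csv without the metadata header');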

Three.js (r64) - Blender JSON export miss normals for smooth shading

Using Three.js r64, I'd like to import an animated object with its smoothing groups from Blender; the file is exported as JSON through the Three.js Blender exporter.
The animation part is working fine.
In Blender, the model looks fine (there is a small smoothing group around the central part).
Picture: http://www.defresne.fr/demo/so/three/smooth_shading/gears.png
I can get the same result when exporting to OBJ with the 'Smooth Groups' and 'Include Normals' options checked. However, I can't get it working correctly when exporting a JSON file (with normals). Below are pictures of the scene, along with a live demo.
Picture: http://www.defresne.fr/demo/so/three/smooth_shading/three_gears.png
Live demo: http://www.defresne.fr/demo/so/three/smooth_shading/
I searched intensively all over the web and couldn't find correct information. The best I found is another question on SO, which is a bit old (r55) and never got an accepted answer.
I did try to compute the object's normals with
geometry.computeFaceNormals();
geometry.computeVertexNormals();
but, obviously, that computes normals for the whole object and results in a completely smoothed object.
So, what would be a correct approach to make JSON smoothing groups work in three.js? Wait for a built-in function? Build it myself? Modify the exporter?
As three.js seems to load OBJ and Collada models with smoothing groups correctly, maybe I could borrow some of the code in those loaders to get the logic?
Thanks for your help
[EDIT]
I just found something great!
In Blender, produce two exports of the model: first a JSON file, second an OBJ file. Load the second one with the three.js online editor, then convert it to get the geometry JSON...
I can collect the vertices, normals and faces of this freshly exported geometry and copy them into the first exported file.
It works fine! I got nice shading groups. Even skinning works fine.
But it's a tedious process, and I wish I could save myself the extra conversions.
So does that mean there is a problem while exporting geometry from Blender? Any idea why?
Any help would be greatly appreciated!
OK, I finally found out what happens.
The r64 Three.js Blender exporter doesn't export smoothing groups, so if you need to preserve them, there is no other solution than to export the geometry to an OBJ file and then convert it with the Python script 'convert_obj_three.py' available in the Three.js repository. The converted file will have correct normals. (Don't forget to check the normals option while exporting the OBJ file.)
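If you use the copy-the-geometry workaround described in the edit above, the manual copy can be scripted. This is only a sketch, under the assumption that both files use the Three.js JSON model format (format 3) with top-level vertices, normals, faces and uvs arrays, and that the vertex count/order still matches the skinning data (which the edit above reports it did); the file names are placeholders:

// Node sketch: copy the geometry arrays from the OBJ-converted JSON
// (which carries the smoothing-group normals) into the animated JSON
// produced by the Blender exporter, keeping animation/skinning keys as-is.
var fs = require('fs');

var animated = JSON.parse(fs.readFileSync('gears_blender.json', 'utf8'));
var converted = JSON.parse(fs.readFileSync('gears_from_obj.json', 'utf8'));

['vertices', 'normals', 'faces', 'uvs'].forEach(function (key) {
  if (converted[key] !== undefined) {
    animated[key] = converted[key];
  }
});

fs.writeFileSync('gears_merged.json', JSON.stringify(animated));
console.log('Wrote gears_merged.json');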

three.js update geometry of json file

I am working on a project and I got stuck on this one thing: I have a BVH file with an animation, which I put on a biped in 3ds Max. I then export every frame into a JSON file with the three.js JSON exporter, and when the animation plays I remove the old mesh and add a new one. This causes the model to be uploaded multiple times, which of course is not what I want.
I use the JSONLoader, and I wondered if there is a way to update only the geometry every frame. Also, I made those separate files because I don't use a skin and I don't have morph targets, since I use a BVH file in 3ds Max.
It would be awesome if someone could help me with this.
Example: http://www.deschaatssport.nl/3dsportsvisualiser/versie3/?p=3# (press the play button)
Mathijs Jansen
Assuming your JSON exporter keeps the number and order of faces/vertices intact, you could do something like this:
1. Load the initial model as you do now.
2. Set geometry.dynamic = true;
3. Load the rest of the JSON files with plain old AJAX (alternatively, you could combine the frame JSON files into one file, whatever works, so that you have access to all of them through your own code).
4. For each frame, loop over the new frame's vertex position array (found in the JSON for each frame) and overwrite the coordinates of your existing geometry vertices (geometry.vertices[i].x = myframevertices[i][0]; etc.).
5. Set geometry.verticesNeedUpdate = true; and render the frame (see the sketch below).
Again I'm just assuming that each JSON file has the same amount and order of faces/vertices, and only the vertex positions change in each frame. I'm not completely sure if it works like that.
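With those assumptions, a rough sketch of steps 2-5 (Three.js ~r64 era API; it assumes an existing scene and render loop, placeholder file names, and that each frame JSON's vertices array is the flat [x, y, z, ...] list the format 3 exporter writes, so the indexing differs slightly from the myframevertices[i][0] pseudo-line above):

// Load the base model once and keep its geometry around for updates.
var loader = new THREE.JSONLoader();
var mesh, frames = [];          // frames[i] = flat vertex array for frame i

loader.load('frame_000.json', function (geometry, materials) {
  geometry.dynamic = true;                                        // step 2
  mesh = new THREE.Mesh(geometry, new THREE.MeshFaceMaterial(materials));
  scene.add(mesh);
});

// Step 3: fetch the remaining frame files with plain AJAX.
function loadFrame(url, index) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url, true);
  xhr.onload = function () {
    frames[index] = JSON.parse(xhr.responseText).vertices;        // flat array
  };
  xhr.send();
}

// Steps 4-5: overwrite the existing vertices, then flag the update.
function applyFrame(index) {
  var flat = frames[index];
  if (!mesh || !flat) return;
  var verts = mesh.geometry.vertices;
  var count = Math.min(verts.length, Math.floor(flat.length / 3));
  for (var i = 0; i < count; i++) {
    verts[i].set(flat[3 * i], flat[3 * i + 1], flat[3 * i + 2]);
  }
  mesh.geometry.verticesNeedUpdate = true;   // re-upload positions only
}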
There has also been some (experimental?) work on BVH support with Three.js. You might have success searching around, I don't know if there is anything usable though. One related effort here: https://github.com/akjava/BVH-Motion-Creator