In the example documentation, there is the following section:
FiPy doesn’t plot or output anything unless you tell it to:
if __name__ == "__main__":
    viewer = Viewer(vars=(phi,), datamin=0., datamax=1.)
I understand that the current configuration would open a viewer using the Matplotlib or Mayavi backends. However, I would like to be able to export a .pvd or .xdmf file for consolidating the simulation results.
Thanks for your help!
FiPy presently has no such capability. I've experimented a bit with XDMF and multi-timestep Gmsh MSH files, but I need to find time to get back to it. You can save individual data snapshots with the VTKViewer classes, but not time series.
import fipy as fp

vw = fp.VTKCellViewer(vars=(phi, psi))
vw.plot(filename="myFile.vtk")
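As a workaround, you could write one snapshot per timestep and load the numbered files as a sequence in ParaView afterwards. A minimal sketch, where eq, dt, and steps are hypothetical placeholders for your own equation, timestep, and loop setup:

vw = fp.VTKCellViewer(vars=(phi, psi))
for step in range(steps):
    # eq, dt, and steps stand in for whatever your FiPy script already defines
    eq.solve(var=phi, dt=dt)
    # one legacy VTK file per timestep, numbered so ParaView can read them in order
    vw.plot(filename="myFile_{:04d}.vtk".format(step))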
If there are particular things you'd like to see, please update issue #132.
I need to share an interactive plot made using the PlotlyJS package in Julia. According to the documentation of the PlotlyJS.jl package, I need to use the "savehtml" function and set the "js" argument to ":embed" in order to view it offline (screenshot attached). However, I got the error "UndefVarError: savehtml not defined". Can anyone tell me what may cause this problem?
FYI, the "savefig" function can save the plot to an HTML file, but that HTML file cannot be viewed on other machines.
It would also be acceptable if there is another way to save an HTML plot that can be accessed from other machines. The interactive plot is generated by PlotlyJS.jl.
Thanks very much in advance.
This creates a standalone file that can be used on other machines.
However, those other machines need to have access to the internet:
import PlotlyJS

# Write the plot to a standalone HTML file; include_plotlyjs="cdn" loads
# plotly.js from the CDN, so the viewing machine needs internet access.
p = PlotlyJS.Plot(sin.(1:0.1:10))
open("f.html", "w") do f
    PlotlyJS.PlotlyBase.to_html(f, p; include_plotlyjs="cdn", full_html=true)
end
I just checked that this is as far as you can go as of today (version v0.8.18), because there is a bug in the source code of PlotlyBase.
After trying all the solutions I could find on GitHub, I still couldn't find a way to convert a custom-trained YOLOv3 model from Darknet to a TensorFlow format (Keras, TensorFlow, TFLite).
By custom I mean:
I changed the number of classes to 1
I set the image size to 576x576
I set the number of channels to 1 (grayscale images)
So far I am happy with the results on Darknet, but for my application I need TFLite, and I can't find a working conversion method that suits my case.
Has anyone succeeded in doing something similar?
Thank you.
Do you have the resulting .weights file for your custom model?
If so, the following project by peace195 may help:
https://github.com/peace195/tensorflow-lite-YOLOv3
EDIT:
In the above link, use the convert_weights_pb.py file to convert your .weights file to a .pb file.
Then use the .pb file as a saved model and convert it to a .tflite model using the following command.
tflite_convert --saved_model_dir=saved_model/ --output_file yolo_v3.tflite --saved_model_signature_key='predict'
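If you prefer to run the conversion from Python instead of the command line, a minimal sketch (assuming TensorFlow 1.x and the saved_model/ directory produced in the previous step) might look like this:

import tensorflow as tf  # assumes TensorFlow 1.x

# Convert the SavedModel exported by convert_weights_pb.py to a TFLite model,
# using the same 'predict' signature key as the command above.
converter = tf.lite.TFLiteConverter.from_saved_model(
    "saved_model/", signature_key="predict")
tflite_model = converter.convert()

with open("yolo_v3.tflite", "wb") as f:
    f.write(tflite_model)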
Thanks Anton Menshov for your suggestion on improving the answer.
This is the simplest and easiest repo. The author has done a wonderful job, and it works well with YOLOv3, YOLOv3-tiny, and YOLOv4. Please don't forget to change coco.names under classes if you are training on custom classes.
Git link for the code
When I try to load a big CSV from a zip file, the execution log gives me the following error:
----------------------------------------- Error details ------------------------------------------
Component [Clientes:CLIENTES1] finished with status ERROR.
The size of data buffer is only 100663296. Set appropriate parameter in defaultProperties file.
--------------------------------------------------------------------------------------------------
How can I set the appropriate parameter in defaultProperties file?
I tried this link, but my CloudConnect run configurations page is different from the one shown there.
I've created the parameters file and filled in the additional parameters with the right values, as the tutorial says (code below), but the same error appears on the screen.
Name: -config; Value: new_buffer_size.txt
The new_buffer_size.txt file contains just this line:
DEFAULT_INTERNAL_IO_BUFFER_SIZE = 200000000
How can I solve this problem? I need to solve this before the world explodes.
CloudConnect is designed for developing ETLs that run on GoodData cloud workers, and therefore some lower-level settings are suppressed, as in this case. The only legitimate way forward is to modify the ETL so that it can process the data with the current settings. Regarding the docs, the referenced article is outdated; the GoodData docs team is aware of it and is preparing a docs refactoring.
Note: As you have probably noticed, CloudConnect is powered by Javlin's CloverETL, so feel free to check their forums; there you will find how to overcome the issue at a lower level (no UI), but that would work only for data processing on the local machine.
So I was using neo4jrestclient, and I noticed that the QuerySequence class has a .to_html() function (https://github.com/versae/neo4j-rest-client/blob/master/neo4jrestclient/query.py).
However, when I try using it I get the 'Unable to display the graph or the table' error.
I haven't found a working example of it. I was wondering if anyone has gotten this working.
Much appreciated, thanks.
The .to_html() method is a function that IPython uses to render rich content inside Notebooks. When running inside a Notebook, neo4jrestclient asks the Neo4j server for extra information so it can draw the actual graph returned. Therefore, if you run a query inside an IPython Notebook, a D3 graph should be rendered automatically.
from neo4jrestclient.client import GraphDatabase, Node, Relationship

# Inside an IPython Notebook, this query is rendered as a D3 graph automatically
gdb = GraphDatabase(url="http://localhost:7474")
gdb.query("MATCH (me)-[r]-() RETURN me, r LIMIT 10")
A running example can be seen in this gist, although it's still a work in progress. I think I could add an option to populate the needed fields in case you want to use .to_html() outside the IPython Notebook. All you need to do is make neo4jrestclient believe that it's running inside one by modifying the function neo4jrestclient.utils.in_ipnb() so that it always returns True. Let me know if you would use that feature and I will add it.
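For illustration, an untested monkey-patch sketch of that idea, relying only on the in_ipnb() helper mentioned above:

from neo4jrestclient import utils
from neo4jrestclient.client import GraphDatabase

# Pretend we are inside an IPython Notebook so the extra graph data is requested
utils.in_ipnb = lambda: True

gdb = GraphDatabase(url="http://localhost:7474")
results = gdb.query("MATCH (me)-[r]-() RETURN me, r LIMIT 10")
html = results.to_html()  # may now have the fields it needs outside a Notebook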
On the other hand, I am developing ipython-cypher, to have a better integration of IPython, Pandas, NetworkX, and matplotlib with Neo4j, but it's still in alpha.
Update: Now you can add data_contents=True to return the extra data.
results = gdb.query(query, data_contents=True)
Data will be in results.rows and results.graph.
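For example, a quick way to inspect both views of the data (the query string here is just a placeholder):

results = gdb.query("MATCH (me)-[r]-() RETURN me, r LIMIT 10", data_contents=True)
print(results.rows)   # row-oriented results
print(results.graph)  # nodes and relationships for graph rendering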
I want to know if there is a way to check whether a file has been edited. I looked for methods that could do this for me in the Google Apps Script library, but I found nothing about it. I don't know if I searched wrong.
Basically, I need to take a file, get some measurable data about it (something like its size), and store it in a variable. Then I need to take that measurable data again, store it in another variable, and compare the two to see if there was a change. I need a boolean return value.
Anyone?
You could do the following (pseudo with links to documentation):
1. Get the file you want to check using the DocList Class.
2. Get the ID of that file using File.getId().
3. Get the last-edit timestamp using File.getLastUpdated().
4. Store this value in a Spreadsheet, or maybe in Script or User Properties.
5. When you want to check whether the file was updated, simply get it again with File.getFileById().
6. Repeat step 3.
7. Compare the two last-edited timestamps with an operator like !=, or do more complex comparisons on the Dates if you want.
8. Depending on the result of step 7, return true or false.
Google's documentation is great for all their services. You just need to read it a bit to understand what kind of power you have through scripting. Hopefully my pseudo-method helps in solving your problem!
Look at the file's last-updated date: https://developers.google.com/apps-script/reference/drive/file#getLastUpdated(). For storing data, look up the storing-data section in the Apps Script help pages.
You could also use the GAT General Audit Tool (http://goo.gl/hzZ2yf...), which reports when files were edited, viewed, and much more.