Deferring downloads instead of uploading them with the job? - autodesk-forge

My goal is to update the viewer only and then offer the option to generate the items for download, and I was wondering what the best way to handle this is.
Currently, when we send a workitem, it returns with the updated SVF and updates the viewer, along with creating and uploading the selected export formats (STL, STP, DWG), the BOM, etc. This increases the time the user has to wait for the workitem to return.
Is a new AppBundle required or can this be handled within the same one?
Thanks in advance for your help!

This can be achieved with a single AppBundle; you just need multiple activities. Each activity can use the same AppBundle and be defined to exercise a different portion of it, selected via command-line flags. Then, in each phase, you send a workitem using the activity you need.
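To make that split concrete, here is a minimal C# sketch (not from the original answer) of registering two activities that point at the same AppBundle but pass different command-line flags. The endpoint and JSON field names follow the Design Automation v3 activity schema as I understand it, and the activity IDs, engine version, and the /updateOnly and /export flags are hypothetical placeholders for whatever your own bundle actually parses.

using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class ActivitySetup
{
    static readonly HttpClient Http = new HttpClient();

    // POST one activity definition to Design Automation v3 (field names assumed).
    static async Task CreateActivityAsync(string accessToken, object activity)
    {
        var request = new HttpRequestMessage(HttpMethod.Post,
            "https://developer.api.autodesk.com/da/us-east/v3/activities")
        {
            Content = new StringContent(JsonSerializer.Serialize(activity),
                                        Encoding.UTF8, "application/json")
        };
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
        (await Http.SendAsync(request)).EnsureSuccessStatusCode();
    }

    public static async Task CreateBothActivitiesAsync(string accessToken)
    {
        // Phase 1: only regenerate the model/SVF so the viewer can be refreshed quickly.
        var updateOnly = new
        {
            id = "UpdateModelActivity",                          // hypothetical name
            engine = "Autodesk.Inventor+2024",                   // use your engine/version
            appbundles = new[] { "yournickname.YourBundle+prod" },
            commandLine = new[]
            {
                "$(engine.path)\\InventorCoreConsole.exe /al \"$(appbundles[YourBundle].path)\" /updateOnly"
            },
            parameters = new { }                                 // your input/output arguments
        };

        // Phase 2: same bundle, but the /export flag tells it to also produce
        // STL/STP/DWG and the BOM; run this only when the user asks for downloads.
        var exportFormats = new
        {
            id = "ExportFormatsActivity",
            engine = "Autodesk.Inventor+2024",
            appbundles = new[] { "yournickname.YourBundle+prod" },
            commandLine = new[]
            {
                "$(engine.path)\\InventorCoreConsole.exe /al \"$(appbundles[YourBundle].path)\" /export"
            },
            parameters = new { }
        };

        await CreateActivityAsync(accessToken, updateOnly);
        await CreateActivityAsync(accessToken, exportFormats);
    }
}

Inside the bundle, your add-in reads the flag it was started with and runs only the matching portion of the work, so the fast "update viewer" workitem returns quickly and the export workitem is sent only when the user actually requests the downloads.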

Related

Logging in Revit Design Automation add-in

I want to send some diagnostic output to the default report.txt file.
Some posts show exceptions being logged to this report.txt file somehow (automatically or not?).
Also, I see in some samples that people do the logging with
System.Console.WriteLine().
I've tried this, but I still can't see the output in the report file.
Could you tell me how to achieve this?
I understand there is an option to create another log file and send it back with the result, but I think it would be easier to use this existing report.txt.
Thanks!
UPDATE: System.Console.WriteLine() works.
The reason why I didn't see the output was that my add-in failed to load.
So, it simply didn't reach this line of code.
Logging in Design Automation for Revit AppBundles can indeed be done with System.Console.WriteLine. Anything sent to standard output will be captured in your workitem's report.txt. For example, the following code:
System.Console.WriteLine("Hello World!");
will generate the following line in report.txt:
[04/23/2020 19:20:59] Hello World!
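Building on that, here is a small hedged sketch (the helper's name and shape are my own, not part of any Revit or Design Automation API) of wrapping the add-in's work in a try/catch so exception details are also written to standard output and therefore show up in report.txt:

using System;

// Illustrative helper only: any Console output inside it ends up in report.txt.
public static class ReportLogger
{
    public static void Run(string stepName, Action work)
    {
        Console.WriteLine($"[add-in] starting: {stepName}");
        try
        {
            work();
            Console.WriteLine($"[add-in] finished: {stepName}");
        }
        catch (Exception ex)
        {
            // Logging the exception explicitly guarantees the message and stack
            // trace are captured, whether or not the engine reports them itself.
            Console.WriteLine($"[add-in] FAILED: {stepName}: {ex}");
            throw; // rethrow so the workitem is still marked as failed
        }
    }
}

Called like ReportLogger.Run("export", () => DoExport(doc)) (DoExport being whatever your add-in does), this also makes the situation from the UPDATE above easier to diagnose: an add-in that never loads produces no such lines at all, while one that loads and then throws leaves the FAILED entry in the report.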

CloudConnect CSV buffer size

When I try to load a big CSV from a zip file, the execution log gives me the following error:
----------------------------------------- Error details ------------------------------------------
Component [Clientes:CLIENTES1] finished with status ERROR.
The size of data buffer is only 100663296. Set appropriate parameter in defaultProperties file.
--------------------------------------------------------------------------------------------------
How can I set the appropriate parameter in the defaultProperties file?
I tried this link, but my CloudConnect run configurations page is different from the one in the link:
I've created the parameters file and filled in the additional parameters with the right values as the tutorial says (see below), but the same error appears on the screen.
Name: -config; Value: new_buffer_size.txt
The new_buffer_size.txt file contains just this line: DEFAULT_INTERNAL_IO_BUFFER_SIZE = 200000000
How can I solve this problem? I need to solve this before the world explodes.
CloudConnect is designed for developing ETLs that run on GoodData cloud workers, and therefore some lower-level settings, such as this one, are not configurable. The only legitimate way forward is to modify the ETL so that it can process the data with the current settings. Regarding the docs, the referenced article is outdated; the GoodData docs team is aware of it and is preparing a docs refactoring.
Note: As you have probably noticed, CloudConnect is powered by Javlin's CloverETL, so feel free to check their forums; there you will find how to overcome the issue at a lower level (no UI), but that would work only for data processing on a local machine.

How to manage temporary data in DotNetNuke?

I am a beginner in DNN. I am creating a module which provides a Login, a Dashboard, and an Add/Update form. I have data in JSON format. I want to store it temporarily while the user uses the website; the data should be destroyed as soon as the user closes the website.
Currently I have created a folder in the Solution Explorer of my Visual Basic project and created three .json files: login_info.json, basic_info.json, and auth_info.json. I write JSON data to them whenever the user logs in and blank them out when the user logs out.
The above method is working fine now, but I am not sure it will still work once I publish this module.
I may also run into a situation where I need to store an image somewhere; I don't know how I will manage that.
Can anybody please guide me?
Is this a proper way to store data temporarily in DNN?
Is there any other, better way?
After getting a reply suggesting the database:
Is there any table in DotNetNuke that serves the same purpose as User Meta?
You use the ConnectionString that is used by DNN and access the database as you would normally.
DotNetNuke.Common.Utilities.Config.GetConnectionString()
Or you can use the Data Access Layer that the DNN Framework supplies. For that, take a look at the Christoc templates; they contain everything you need to communicate with the DB.
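As a hedged illustration of the first option (the table name MyModule_SessionData and its schema are hypothetical, something your module would create itself), reading the DNN connection string and writing through plain ADO.NET could look like this:

using System.Data.SqlClient;

public class SessionDataStore
{
    // Upserts the per-user JSON blob into a module-specific table (hypothetical schema:
    // MyModule_SessionData(UserId INT PRIMARY KEY, Json NVARCHAR(MAX))).
    public void Save(int userId, string json)
    {
        string connectionString = DotNetNuke.Common.Utilities.Config.GetConnectionString();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            @"IF EXISTS (SELECT 1 FROM MyModule_SessionData WHERE UserId = @userId)
                  UPDATE MyModule_SessionData SET Json = @json WHERE UserId = @userId
              ELSE
                  INSERT INTO MyModule_SessionData (UserId, Json) VALUES (@userId, @json)",
            connection))
        {
            command.Parameters.AddWithValue("@userId", userId);
            command.Parameters.AddWithValue("@json", json);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }

    // Removes the blob on logout so nothing outlives the user's session.
    public void Clear(int userId)
    {
        string connectionString = DotNetNuke.Common.Utilities.Config.GetConnectionString();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "DELETE FROM MyModule_SessionData WHERE UserId = @userId", connection))
        {
            command.Parameters.AddWithValue("@userId", userId);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}

Compared with writing .json files into the module folder, this survives publishing and multiple simultaneous users, and images can be handled the same way by storing a file path or a VARBINARY column instead of the JSON text.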

File-Monitoring via Lua Script

Good evening,
I am currently developing a way to import machine-created data from a CSV file into a database.
The question I have is: is there a way to react to a change in a CSV file with Lua?
The file gets a line in this format:
17162H,"801234500001",9/23/2016 12:33:30 PM,"INV"
Every time a scanner finishes a scan process, a new line is appended below the old ones, but there is no direct connection to the database that could trigger the script.
It doesn't matter whether the change is detected via a different file size, a different folder size (of the folder that contains the file), or a change in the file information (like the date of last opening), but I can't keep opening and reading the file continuously for performance reasons.
Also, this is the first time I've asked here, so sorry for my clunky wording; I'll try to improve over time.
Take a look at linotify; it has Lua bindings for inotify and looks like it should do the trick, using the "modify" event to trigger your script.
I use a LibUV-based variant in my spylog application.
Usage:
file_monitor(path_to_file, {eol = '\r?\n'}, function(line)
...
end)
If you need to run this on Windows, you can use winapi library, which supports file watchers. Here is an example of how it's used in one of my projects; you'll need to call winapi.sleep() to allow time for the check to trigger.

Set file date with managed code in WP8

Is it possible to change the file creation date or file last write date from managed code in WP8?
I can read the date stamps using FileInfo, but these properties are read-only. Using native code, it looks like I could use the SetFileInformationByHandle API. My project does use native code, so I can add a little helper function, but this seems like overkill.
Reason: I've got an online multiplayer game (4sFear) that lets people upload their own avatars. Currently I just set the source of an image to the HTTP address of the avatar, but I would like to be a little smarter and cache the images locally. I can retrieve the last time an avatar was updated before it needs to be displayed. I know I could store the dates the avatars were updated separately, but it makes sense that I should just be able to set the last write date of the file after creating it.
The SetFileInformationByHandle API is supported via the DLL api-ms-win-core-file-l1-2-0.dll (more info here: https://msdn.microsoft.com/library/windows/apps/jj662956%28v=vs.105%29.aspx#BKMK_ListofsupportedWin32APIs) and can be called using DllImport.
I don't have a machine to test on, but you could try the following: How to update the change time of a file from C#?
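For reference, here is an untested sketch of what that P/Invoke path could look like, assuming (as the answer above implies) that these exports can be reached via DllImport from a WP8 project. The DLL names come from the supported-API list linked above and should be double-checked, as should the detail that a zero time field in FILE_BASIC_INFO leaves that value unchanged.

using System;
using System.Runtime.InteropServices;

static class NativeFileTime
{
    const uint GENERIC_WRITE = 0x40000000;
    const uint FILE_SHARE_READ = 0x00000001;
    const uint OPEN_EXISTING = 3;
    const int FileBasicInfo = 0; // FILE_INFO_BY_HANDLE_CLASS.FileBasicInfo

    [StructLayout(LayoutKind.Sequential)]
    struct FILE_BASIC_INFO
    {
        public long CreationTime;   // FILETIME as a 64-bit value; 0 = keep current value
        public long LastAccessTime;
        public long LastWriteTime;
        public long ChangeTime;
        public uint FileAttributes; // 0 = keep current attributes
    }

    // DLL names taken from the WP8 supported Win32 API list; verify before shipping.
    [DllImport("api-ms-win-core-file-l1-2-0.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern IntPtr CreateFile2(string fileName, uint desiredAccess, uint shareMode,
                                     uint creationDisposition, IntPtr extendedParameters);

    [DllImport("api-ms-win-core-file-l1-2-0.dll", SetLastError = true)]
    static extern bool SetFileInformationByHandle(IntPtr hFile, int infoClass,
                                                  ref FILE_BASIC_INFO info, uint bufferSize);

    [DllImport("api-ms-win-core-handle-l1-1-0.dll", SetLastError = true)]
    static extern bool CloseHandle(IntPtr hObject);

    // Sets only the last-write time of an existing file to the given UTC timestamp.
    public static void SetLastWriteTimeUtc(string path, DateTime utc)
    {
        IntPtr handle = CreateFile2(path, GENERIC_WRITE, FILE_SHARE_READ, OPEN_EXISTING, IntPtr.Zero);
        if (handle == new IntPtr(-1)) // INVALID_HANDLE_VALUE
            throw new InvalidOperationException("CreateFile2 failed, error " + Marshal.GetLastWin32Error());
        try
        {
            var info = new FILE_BASIC_INFO { LastWriteTime = utc.ToFileTimeUtc() };
            if (!SetFileInformationByHandle(handle, FileBasicInfo, ref info,
                                            (uint)Marshal.SizeOf(typeof(FILE_BASIC_INFO))))
                throw new InvalidOperationException("SetFileInformationByHandle failed, error " + Marshal.GetLastWin32Error());
        }
        finally
        {
            CloseHandle(handle);
        }
    }
}

After downloading an avatar to isolated storage, calling NativeFileTime.SetLastWriteTimeUtc(localPath, serverUpdatedUtc) would let you compare the cached file's last-write time against the server's "last updated" value on the next display, without keeping a separate table of dates.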