Generate an XES file from an event log in CSV format

Could someone help me out with how to generate an XES file from a CSV file containing event logs? I have tried many websites and tutorials but keep failing, because they don't provide adequate information. Kindly help me.

Here you go: Importing CSV files using the ProM 6.5 Log package
You'll need the latest ProM version for that, from www.promtools.org.
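
If you prefer a scripted route instead of the ProM GUI, the pm4py Python library can also convert a CSV event log to XES. A minimal sketch, assuming the CSV has case_id, activity and timestamp columns (those names are assumptions and have to be adapted to your file):

    import pandas as pd
    import pm4py

    # Load the raw event log; the column names below are assumptions about the CSV.
    df = pd.read_csv("event_log.csv")

    # Map the columns onto the standard XES attribute names.
    df = pm4py.format_dataframe(
        df,
        case_id="case_id",
        activity_key="activity",
        timestamp_key="timestamp",
    )

    # Convert to an event log object and export it as XES.
    event_log = pm4py.convert_to_event_log(df)
    pm4py.write_xes(event_log, "event_log.xes")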

Related

Chrome extension how to append or edit a csv file on pc

I am able to find some information on how to read a CSV file on a computer, but is there any way I can modify one? In my Chrome extension I need to add data to each row, one at a time, after scraping some websites. Is there any better way than to read the CSV, store the data in a variable, and rewrite it every time? That becomes problematic when the file gets large. I am looking for a way to "append" to an existing file, or a workaround. Any suggestions appreciated.
Update: From a comment I see it is not possible to read from the file system. But is there any way to read from within the extension directory? How should I do that if the CSV file is included in the zip file of the extension? Can I access it somehow? Code snippets would be helpful.
I'm in the middle of creating something which might help you. Right now you can upload the CSV file and append a "modifier". You can adjust the code according to your requirements. Here's the repo: https://github.com/amanrOnly/CSV_Modifier

CSV file not recognised

I am creating a CSV file on my system and sending it via MFT to another system. When they receive it, their job does not pick up the file. But when they open the file in Excel, save it locally, and reload that same file, the job picks up the records. I can't figure out whether something is wrong with the file I create or with their job. Has anyone experienced something similar?
I'd appreciate any ideas.
Thanks
With so few details, nobody other than you can find out what's wrong.
My suggestion would be to get two files: first, the original file that the job refuses to deal with; second, the file that the job can consume (the one saved via Excel). Then open these two files in a text editor and try to find the differences.
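
If nothing jumps out in a text editor, a small script can surface the usual invisible culprits (a byte order mark, different line endings, a different encoding). A rough sketch in Python; the file names are placeholders:

    # Compare the raw bytes of the file the job rejects ("original") with the
    # copy it accepts ("resaved") to spot invisible differences such as a BOM
    # or different line endings. The file names are placeholders.

    def summarize(path):
        with open(path, "rb") as f:
            data = f.read()
        return {
            "size": len(data),
            "has_utf8_bom": data.startswith(b"\xef\xbb\xbf"),
            "crlf_lines": data.count(b"\r\n"),
            "lf_only_lines": data.count(b"\n") - data.count(b"\r\n"),
            "first_bytes": data[:20],
        }

    original = summarize("original.csv")
    resaved = summarize("resaved_from_excel.csv")

    for key in original:
        if original[key] != resaved[key]:
            print(f"{key}: original={original[key]!r} resaved={resaved[key]!r}")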

Automatically export file from a given software

I have a folder with hundreds of files that were saved in the native format of a specific piece of software (in this case Qualisys Track Manager, whose file format is .qtm).
This software has the option of exporting files to other formats such as TSV, MAT, C3D, ...
My problem: I want to export all my files to TSV format, but the only way I know is to open the software and go to File -> Export -> To TSV, and doing this for hundreds of files is time consuming. So I was thinking of writing a script that would take my files, drive the software, and do the export automatically.
But I have no clue how to do this. I was thinking of writing a script in Notepad++ and running it from the command window, so that I would end up with all the files in TSV format.
[EDIT] After some research I think a batch script or a PowerShell script may help me, but I have no idea how to run the software's commands automatically, or whether it is even possible... (I am using Windows 10)
It is highly likely that .qtm is a proprietary file format, and PowerShell/batch would not understand it. Unless the file can be read in a known way (text, XML, etc.), they will not be able to convert it.
I googled it and it seems QTM has a REST API interface. That would be your best chance. I'm not sure whether the documentation is publicly available; I didn't find it. I'd recommend contacting their support for the REST API documentation, to ask whether the API can handle this task, and for sample code to get you started.
You can then make the REST API calls with Invoke-RestMethod in a loop from PowerShell.
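
For illustration, the same loop idea sketched in Python with the requests library; the URL and request payload here are placeholders, since the real QTM REST API calls have to come from Qualisys' documentation:

    import pathlib
    import requests

    # Placeholder endpoint and payload -- the real QTM REST API URL and request
    # format must be taken from Qualisys' documentation; this only shows the loop.
    EXPORT_URL = "http://localhost:8080/qtm/export"

    for qtm_file in pathlib.Path(r"C:\data\recordings").glob("*.qtm"):
        response = requests.post(
            EXPORT_URL,
            json={"file": str(qtm_file), "format": "tsv"},  # assumed payload shape
            timeout=60,
        )
        response.raise_for_status()
        print(f"Exported {qtm_file.name}")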

Extract excel metadata in Linux

I have used the "extract" command, but it has never been able to find as much information as FOCA found on the Excel spreadsheets I am dealing with.
For example, I am using the FOCA application to harvest and download files from the web; afterwards, it extracts metadata from all of the files.
With regard to Excel files, they appear to contain more metadata than the average PDF file. FOCA is able to detect printer names, email addresses, and a few other things that are stored within the spreadsheet file. However, I cannot find any way to get the same information in Linux using the "extract" command.
Does anyone know a way to process these files in Linux and grab ALL of their metadata? It seems the extract command may be limited, from what I understand.
Thanks
Excel files store a lot of metadata within the file, so you would have to parse the file itself to get at it. Since you're on Linux and can't use the Excel interop, you could try an Excel library such as ExcelWriter or something similar. ExcelWriter is written for .NET, so you'd have to use Mono.
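
If pulling in .NET/Mono feels like overkill, the document properties of an .xlsx file can also be read directly (it is just zipped XML), for instance with the openpyxl Python library. This covers the core document properties, not necessarily every field FOCA reports:

    from openpyxl import load_workbook

    # Works for .xlsx files (zipped XML); the legacy binary .xls format
    # needs a different library.
    wb = load_workbook("spreadsheet.xlsx", read_only=True)
    props = wb.properties

    # Core document properties such as author, last editor and timestamps.
    for field in ("creator", "lastModifiedBy", "created", "modified",
                  "lastPrinted", "title", "subject", "description"):
        print(field, "=", getattr(props, field, None))

Running exiftool against the same file is another quick way to dump most of these fields from a Linux command line.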

BMC Remedy User 7.5: can I make a macro that can read a .csv file?

I need to create a macro in BMC Remedy User 7.5 that can read a CSV file and update all the items contained in it.
Is that possible?
I have to create a large batch of items and edit their locations.
Thank you
You can't do that with a macro, but you can use the Remedy Import Tool. It can import a CSV file automatically; it takes a mapping file and the CSV file (with full paths, of course) as inputs.
Check out the guide titled "BMC Remedy Action Request System 7.6.04 Configuration Guide". You should find what you're looking for there.
Best of luck,
Mike