want to unzip a specific file from a zipped file - Perl script

I have a directory containing a zipped file. I want to extract only the XML document from it. I can already extract the whole archive, but I need only the XML file. It has to be a Perl script, and it must run on Windows.
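One way to do this on Windows is with the Archive::Zip module (bundled with Strawberry Perl, otherwise installable from CPAN). A minimal sketch; the archive path and output directory below are illustrative:

```perl
use strict;
use warnings;
use Archive::Zip qw( :ERROR_CODES );

my $zip_path = 'C:\zips\archive.zip';    # illustrative path
my $out_dir  = 'C:\zips\extracted';

my $zip = Archive::Zip->new();
$zip->read($zip_path) == AZ_OK
    or die "Cannot read $zip_path\n";

# Extract only the members whose names end in .xml
for my $member ( $zip->membersMatching( '\.xml$' ) ) {
    $zip->extractMember( $member, "$out_dir/" . $member->fileName() ) == AZ_OK
        or die "Extraction of ", $member->fileName(), " failed\n";
}
```

membersMatching filters the archive's member list by a regex, so the rest of the archive is never written to disk.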

Related

Editing .json files in a zip folder, without unzipping

I am in the process of uploading a huge number of tests for my school (I am a computer science teacher). These come in the form of .h5p files. I need to parse information into the .h5p files from .txt documents, ready for uploading to Moodle courses. To do this, I have built an app to push the data from .txt files into the .json files in the .h5p file.
The problem is that my app converts the h5p to a zip, unzips it and then parses the information, rezips and then changes the extension again to h5p. Would you mind watching this video https://youtu.be/FTyQddAcWa8 and letting me know how I might be able to edit the .json files and then rezip ready for uploading to the Moodle courses? The files throw up errors once unzipped and then zipped again.
I think the unzipping process is altering the relative links.
Bottom line: these tests are critical for my school of 1,274 children in mitigating the impact of the COVID-19 lockdown.
The unzipping process is not the problem, but the zipping is.
When you upload the file, H5P is complaining because it expects some flags to be set when zipping:
-D do not add directory entries
-X eXclude eXtra file attributes
I assume that at some point your script is calling zip. That call would need to pass the correct flags. On a command line, you'd use
zip -rDX myNewFile.h5p *
to pack all files in the current directory into a valid H5P content file named myNewFile.h5p. Just "translate" that into your script.
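If the script does the zipping itself rather than shelling out to zip, the same effect can be sketched with Python's standard zipfile module, which writes file entries only and no extra attribute fields (the function and path names here are illustrative):

```python
import os
import zipfile

def zip_h5p(src_dir, out_path):
    """Re-zip edited content into an H5P package.

    Mirrors `zip -rDX`: adds file entries only (no directory entries)
    and no platform-specific extra fields.
    """
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(src_dir):
            for name in files:
                full = os.path.join(root, name)
                # Archive paths must be relative, with forward slashes.
                arcname = os.path.relpath(full, src_dir).replace(os.sep, "/")
                zf.write(full, arcname)
```

Because only files are walked and added, the resulting archive contains no directory entries for H5P to choke on.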

Possible to load one gzip file of multiple CSVs in Redshift

I am trying to load a compressed file which contains multiple CSV files into Redshift. I followed the AWS documentation Loading Compressed Data Files from Amazon S3. However, I am not sure if I will be able to do the following:
I have multiple CSV files for a table:
table1_part1.csv
table1_part2.csv
table1_part3.csv
I compressed these three files into one table1.csv.gz.
Can I load this gzip file into Redshift table using COPY command?
No, you cannot. However, with the COPY command you can give a folder name (containing all the gzip files) or a wildcard. So don't compress them into one file; compressing them as independent files will work fine.
You can also achieve this by creating a manifest file that lists all of your CSV files, and then specifying the manifest file in your COPY command, like:
copy customer
from 's3://mybucket/cust.manifest'
iam_role 'arn:aws:iam::0123456789012:role/MyRedshiftRole'
manifest;
For more details, refer to the Amazon Redshift documentation, section "Using a Manifest to Specify Data Files".
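The manifest is itself a small JSON file stored in S3 that lists every data file explicitly. For the three parts above, compressed individually, it might look like this (the bucket name is illustrative):

```json
{
  "entries": [
    { "url": "s3://mybucket/table1_part1.csv.gz", "mandatory": true },
    { "url": "s3://mybucket/table1_part2.csv.gz", "mandatory": true },
    { "url": "s3://mybucket/table1_part3.csv.gz", "mandatory": true }
  ]
}
```

Because the files are gzip-compressed, the COPY command also needs the GZIP parameter alongside manifest.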

Use of zip files as Themes in Resource files

I have this bulk load of html, js, css, less files, including zip files (themes), to be placed in a Resource file in Lotus Notes. Will browsers be able to view the files inside the zip? There are so many files in the themes folder that the file paths exceed the allowed length, which is why I wanted to put them in a zip file.
If you put a zip file in resources, then it will be served as a zip to browsers. So that's not the solution. You need to unzip it and add all files.
You can also put the unzipped files in the default HTML folder on the Domino server without adding them to database resources. On Linux, it's usually /local/notesdata/domino/html/ and on Windows C:\data\domino\html.

Import csv with paths to pdf files

I have a custom content type (product) containing a title, some text and a file (a pdf file).
A lot of products have to be imported to the Drupal CMS. I discovered the plugin "Feeds" which seems to fulfill my requirements.
I managed to import a csv file containing a title and some text.
Now my csv file has a column containing the path to each pdf file (like C:\mypdfs\product1.pdf). How can I get Feeds to import those pdf files as well? In the "Mapping" configuration I'm not sure which target to select for the column that contains the path to a local file.
Using the Feeds Tamper module you can manipulate the value of the imported data for one target. You can build a custom tamper plugin (see how) and use it to process the retrieved value (the file path): use file_get_contents to read the file from the imported path and file_save_data to save it in Drupal, which returns a file object that you can attach to an entity (this could help).
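A minimal sketch of such a tamper callback, assuming Drupal 7 and the Feeds Tamper plugin API; the module name, plugin file name, and the public:// destination are all illustrative:

```php
<?php
// Hypothetical plugin file, e.g. my_module/plugins/tamper/attach_local_file.inc.
$plugin = array(
  'form'     => 'my_module_attach_local_file_form',
  'callback' => 'my_module_attach_local_file_callback',
  'name'     => 'Attach local file from path',
  'category' => 'File',
);

function my_module_attach_local_file_callback($result, $item_key, $element_key, &$field, $settings, $source) {
  // $field holds the path from the CSV column, e.g. C:\mypdfs\product1.pdf.
  $data = file_get_contents($field);
  if ($data !== FALSE) {
    // Save into Drupal's managed file system and replace the raw path with
    // the resulting file object, which Feeds can map to the file field.
    $field = file_save_data($data, 'public://' . drupal_basename($field));
  }
}
```

The key idea is that after tampering, the target value is no longer a bare path but a managed file object.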

turn folder into a zip file

I have a folder on my desktop and I want to convert it into a .zip file. It shouldn't ask me where to save it, but just save it straight to my desktop or to any folder I specify.
I tried ASZip, FZip, etc., but I can't get them to work. None of them seems to let me just add a folder and zip it.
I was only able to create a ByteArray with ASZip, but when I saved it, it left me with a file that could not be opened.
Would it be possible to achieve what I want without the use of an external library?
Any help would be appreciated.
You can't actually zip a folder, but you can zip all the contents inside the folder. You would have to use FileReferenceList to load all the files inside the folder into Flash.
FileReferenceList lets the user select multiple files in the browse window.
You would then pass all of these files to the zip-managing library and get a ByteArray from it.
Finally, you would save that ByteArray locally as "yourFileName.zip" using FileReference.save().
Note that the application cannot save the file to a predetermined location; the user has to pick the location in the "save to" prompt.
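Assuming FZip as the zip-managing library (its addFile/serialize API), the flow above might be sketched like this; event handling is kept minimal and the file name is illustrative:

```actionscript
import flash.events.Event;
import flash.net.FileReference;
import flash.net.FileReferenceList;
import flash.utils.ByteArray;
import deng.fzip.FZip;  // third-party zip library

var picker:FileReferenceList = new FileReferenceList();
var pending:Array = [];
var zip:FZip = new FZip();

picker.addEventListener(Event.SELECT, onSelect);
picker.browse();  // user multi-selects the folder's files

function onSelect(e:Event):void {
    pending = picker.fileList.concat();
    for each (var f:FileReference in pending) {
        f.addEventListener(Event.COMPLETE, onLoaded);
        f.load();  // read the file's bytes into f.data
    }
}

function onLoaded(e:Event):void {
    var f:FileReference = FileReference(e.target);
    zip.addFile(f.name, f.data);
    pending.splice(pending.indexOf(f), 1);
    if (pending.length == 0) {
        var bytes:ByteArray = new ByteArray();
        zip.serialize(bytes);  // write the whole archive into the ByteArray
        // Opens the "save to" prompt; the user chooses the final location.
        new FileReference().save(bytes, "yourFileName.zip");
    }
}
```

Note that multi-select flattens the folder structure: each FileReference carries only a file name, not a relative path, so subfolders are not preserved in the archive.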