Couchbase: How to copy data from one bucket to another?

I have a bucket (Bucket1) that I need to delete, and move its documents to another bucket (Bucket2).
What is the best way to achieve this?
I'm using version 4.5.

The cbtransfer tool should be able to do the data transfer you need.
https://docs.couchbase.com/server/6.0/cli/cbtransfer-tool.html
bucket-delete can be used to delete a bucket once you are finished with it.
https://docs.couchbase.com/server/6.0/cli/cbcli/couchbase-cli-bucket-delete.html
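For example, a minimal sketch assuming a single-node cluster at localhost:8091 and Administrator credentials (adjust the host, credentials, and bucket names to your setup; Bucket2 must already exist before the transfer):

    # Copy all documents from Bucket1 to Bucket2 on the same cluster
    cbtransfer http://localhost:8091 http://localhost:8091 \
        -b Bucket1 -B Bucket2 -u Administrator -p password

    # Once you have verified the copy, delete the source bucket
    couchbase-cli bucket-delete -c localhost:8091 \
        -u Administrator -p password --bucket Bucket1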

Related

Autodesk-Forge bucket system: New versioning

I am wondering what the best practice is for handling new versions of the same model in the Data Management API bucket system.
Currently, I have one bucket per user, and files with the same name overwrite the existing model when doing an SVF/SVF2 conversion.
To handle model versioning in the best manner, should I:
1) create one bucket per converted file, or
2) continue with one bucket per user?
If 1): is there a limit on the number of buckets that can be created?
If 2): how do I get the translation to accept a bucketKey different from the file name? (As it is now, the uploaded file needs to keep the original filename for the translation to work.)
Cheers in advance for the assistance.
In order to translate a file, you do not have to keep the original file name, but you do need to keep the file extension (e.g. *.rvt) so that the Model Derivative service knows which translator to use. So you could just create files with different names: perhaps add a suffix like "_v1", etc., or generate random names and keep track of which file is which version of which model in a database. Up to you.
There is no limit on the number of buckets, but it might be overkill to have a separate one for each file.
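For example, a suffix-based naming scheme (hypothetical object keys) could look like the following, keeping the .rvt extension so the right translator is still picked:

    house-model_v1.rvt
    house-model_v2.rvt
    house-model_v3.rvt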

Rename Bucket or transfer all models

I would like to know if it is possible to rename a Bucket.
If not, I would like to know if I can move all my models on the bucket I want to rename to a new bucket without translating each model again.
Thanks.
Unfortunately, it is not possible to rename a bucket, but it is possible to copy files (objects) across buckets with this API.
For the viewables, it is a different story: they are not stored in OSS buckets but on the Model Derivative server. This means you either need to translate them again if you want to use the new URN, or leave them where they are and map the old URNs to the new ones. Viewables are destroyed only when you delete their manifest.

Data Masking on huge CSV files stored in AWS S3

I have huge CSV files of size ~15 GB in AWS S3 (s3://bucket1/rawFile.csv). Let's say the schema looks like this:
cust_id, account_num, paid_date, cust_f_name
1001, 1234567890, 01/01/2001, Jonathan
I am trying to mask the account number and customer name columns, create a new maskedFile.csv, and store it in another AWS S3 bucket (s3://bucket2/maskedFile.csv), as follows:
cust_id, account_num, paid_date, cust_f_name
1001, 123*******, 01/01/2001, Jon*******
This needs to be done just once with one snapshot of the payment data.
How can I do this, and what tools should I use to achieve it? Please let me know.
AWS Glue is AWS's managed ETL and data catalog tool, and it was made for exactly this kind of task.
You point it at the source folder on S3, tell it the destination folder where you want the results to land, and it guides you through the transformations you want. Basically, if you can write a bit of Python, you can do a simple masking transform in no time.
Once that's set up, Glue will automatically transform any new file you drop into the source folder, so you have not only created the code necessary to do the masking, you have a completely automated pipeline that runs when new data arrives. I saw that your case only calls for it to run once, but writing the code for a one-off run would not be much easier.
To see an example of using Glue to set up a simple ETL job, take a look at: https://gorillalogic.com/blog/in-search-of-happiness-a-quick-etl-use-case-with-aws-glue-redshift/. And there are plenty of other tutorials out there to get you started.
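As a rough sketch of what the masking transform could look like in the Python (PySpark) script behind such a job: the column names and S3 paths are taken from the question, while everything else (the app name, the output folder, the keep-3-characters rule) is illustrative:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import udf
    from pyspark.sql.types import StringType

    spark = SparkSession.builder.appName("mask-csv").getOrCreate()

    # Read the raw CSV with its header row
    df = spark.read.csv("s3://bucket1/rawFile.csv", header=True)

    @udf(StringType())
    def mask(value):
        # Keep the first 3 characters and star out the rest
        if value is None:
            return None
        return value[:3] + "*" * max(len(value) - 3, 0)

    masked = (df
              .withColumn("account_num", mask("account_num"))
              .withColumn("cust_f_name", mask("cust_f_name")))

    # Spark writes a folder of part files rather than a single CSV;
    # coalesce(1) keeps the output to one file, which is fine for a one-off snapshot
    masked.coalesce(1).write.csv("s3://bucket2/maskedFile", header=True, mode="overwrite")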
You can try FileMasker.
It will mask CSV (and JSON) files in S3 buckets.
You can run it as an AWS Lambda function although the Lambda restrictions will limit the input file sizes to a few GB each.
If you can split the input files into smaller files then you'll be fine. Otherwise, contact the vendor for options. See https://www.dataveil.com/filemasker/
Disclaimer: I work for DataVeil.

Retrieving json file from Firebase Storage

I would like to be able to get a json file from Firebase Storage to work with. I don't need to push it back.
Can Firebase Storage give me that, and if so, how can I do it with Angular?
If not, is the Firebase database a better choice?
I already downloaded firebase and angularfire2.
I tried getDownloadURL and getMetadata, but I don't know how to get at the data inside.
Thanks for your help!
I'd suggest you use the Realtime Database; I suspect it will be simpler for you to pick up. The Realtime Database already stores all of its information as JSON, so you could simply query the database and get your JSON data... but if you used the Storage portion of Firebase, then you'd have to jump through some hoops to actually read the file you get and parse out the data.
Check out this documentation for examples of how to query the realtime database using Angularfire.
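That said, if you do want to read the JSON straight from Storage, one approach is to resolve the file's download URL and then fetch it over HTTP. A minimal sketch, assuming angularfire2 v5 (with AngularFireStorageModule and HttpClientModule registered in your app module) and a hypothetical file at data/sample.json:

    import { Component } from '@angular/core';
    import { HttpClient } from '@angular/common/http';
    import { AngularFireStorage } from 'angularfire2/storage';
    import { switchMap } from 'rxjs/operators';

    @Component({ selector: 'app-json-loader', template: '' })
    export class JsonLoaderComponent {
      constructor(private storage: AngularFireStorage, private http: HttpClient) {
        // Resolve the file's download URL, then fetch and parse its JSON body
        this.storage.ref('data/sample.json').getDownloadURL().pipe(
          switchMap(url => this.http.get(url))
        ).subscribe(data => console.log(data));
      }
    }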

Is it possible to save data to a json file on local disk using $resource.save without using any server side implementation?

I am trying to build an Employee Management app in AngularJS where a basic operation is adding an employee.
I am displaying details using a service that gets JSON data from the mock JSON file I am using.
Similarly, can I add form data to a text file on the hard disk?
I have seen it done in a tutorial using $resource.save.
If it is possible without any server-side code, please share an example; it would be helpful.
Thanks.
You can make use of HTML5 local browser storage, as this does not require folder access. More details here: http://diveintohtml5.info/storage.html
AngularJS has modules for local storage that you can use to access such storage, like this one: https://github.com/grevory/angular-local-storage
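A minimal sketch using the plain Web Storage API that underlies both suggestions (the 'employees' key and the sample record are made up); note this persists data per browser origin rather than writing a file to disk:

    // Persist the employee list in the browser's local storage
    var employees = [{ id: 1, name: 'Jonathan' }];
    localStorage.setItem('employees', JSON.stringify(employees));

    // Read it back later, e.g. after a page reload
    var saved = JSON.parse(localStorage.getItem('employees') || '[]');
    console.log(saved);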