One way to import a CSV file is by using the COPY command in cqlsh. I am wondering: is there an effective way to import a CSV file from DevCenter?
Sorry, there currently is no way to import from CSV using DevCenter.
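For now you'll need to stay with cqlsh. For reference, the COPY approach you mention looks like this (keyspace, table and file names are placeholders):
COPY my_keyspace.my_table FROM 'data.csv' WITH HEADER = true;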
I want to read a single CSV file from a Google Cloud Storage bucket with pyarrow. How do I do this?
I can create a FileSystem object with gcsfs, but I don't see a way to provide this to pyarrow.csv.read_csv.
Do I need to create some sort of file stream from the file system? What's the best way to do this?
import gcsfs
import pyarrow.csv as csv
fs = gcsfs.GCSFileSystem(project='foo')
csv.read_csv("bucket/foo/bar.csv", filesystem=fs)
TypeError: read_csv() got an unexpected keyword argument 'filesystem'
Using pyarrow version 6.0.1
I'm guessing you are working from this doc. You're correct that the approach listed there does not work with read_csv, because there is no filesystem parameter. We can still do this in general, but the process is a bit different.
Pyarrow has its own filesystem abstraction. If you have a pyarrow filesystem, then you can first open a file and then use that file to read the CSV:
import pyarrow as pa
import pyarrow.csv as csv
import pyarrow.fs as fs
local_fs = fs.LocalFileSystem()
with local_fs.open_input_file('foo/bar.csv') as csv_file:
    table = csv.read_csv(csv_file)  # returns a pyarrow.Table
Unfortunately, a gcsfs.GCSFileSystem is not a "pyarrow filesystem", but you have a few options.
The method gcsfs.GCSFileSystem.open can give you a "python file object" which you can use as input to pyarrow.csv.read_csv.
import gcsfs
import pyarrow.csv as csv
fs = gcsfs.GCSFileSystem(project='foo')
with fs.open("bucket/foo/bar.csv", 'rb') as csv_file:
    table = csv.read_csv(csv_file)  # read_csv accepts any readable file object
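Alternatively, you can wrap the gcsfs filesystem so it becomes a pyarrow filesystem, using pyarrow's fsspec handler. A sketch, reusing the same project and bucket path as above:
import gcsfs
import pyarrow.csv as csv
import pyarrow.fs as pafs
gcs = gcsfs.GCSFileSystem(project='foo')
# wrap the fsspec-compatible filesystem so pyarrow can use it natively
pa_fs = pafs.PyFileSystem(pafs.FSSpecHandler(gcs))
with pa_fs.open_input_file('bucket/foo/bar.csv') as csv_file:
    table = csv.read_csv(csv_file)
This keeps everything inside pyarrow's own filesystem API, which is handy if you later switch to dataset-based reads.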
I'm looking for a way to write/convert .csv data to MF4 format programmatically.
I have done it one file at a time using IPEmotion.
I have checked out https://www.turbolab.de/mdf_libf.htm and opened up the code listed on that site. If anyone has used this before, I would be grateful for advice. I primarily use LabVIEW, but am open to Python/C++/C# solutions.
You could load the CSV using pandas and append it to an MDF object using this lib: https://asammdf.readthedocs.io/en/latest/api.html#mdf4 (see the append method):
from asammdf import MDF
import pandas as pd
df = pd.read_csv('input.csv')
mdf = MDF()
mdf.append(df)  # appends the DataFrame as a new data group
mdf.save('output.mf4')
How can I import my Excel file into phpMyAdmin without converting it to CSV format, considering the Excel file is too large?
Got my answer, and it worked perfectly: http://itsolutionstuff.com/post/php-import-excel-file-into-mysql-database-tutorialexample.html
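The linked tutorial is PHP-based. For reference, the same idea in Python, reading the workbook directly and loading it into MySQL without a CSV step (a sketch, assuming pandas, openpyxl, SQLAlchemy and PyMySQL are installed; the connection string and table name are placeholders):
import pandas as pd
from sqlalchemy import create_engine
df = pd.read_excel('data.xlsx')  # reads .xlsx directly (uses openpyxl)
engine = create_engine('mysql+pymysql://user:password@localhost/mydb')
# chunksize keeps memory use bounded for large files
df.to_sql('my_table', engine, if_exists='append', index=False, chunksize=1000)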
As the title says, I wonder if MongoDB has a data file format it can import directly? I know that MySQL has the "sql" file format for direct import. I am now on a project with the same requirement. Can anyone tell me?
MongoDB can import data using the mongoimport tool from the JSON, CSV and TSV data formats, as you can see here.
MongoDB internally represents data as binary-encoded JSON (BSON), so importing and exporting in JSON format is fast and intuitive.
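For example, importing a JSON file would look something like this (database, collection and file names are placeholders):
mongoimport --db mydb --collection users --file users.json
For CSV you would add --type csv --headerline so the first row is used for the field names.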
Of course. MongoDB uses mongodump/mongoexport to export data to external files and mongorestore/mongoimport to import data into its databases. The pairs do have some differences: mongodump/mongorestore work with BSON, while mongoexport/mongoimport work with JSON, CSV and TSV. For more details, please refer to the MongoDB docs for mongodump, mongoexport, mongorestore and mongoimport.
I need to import more than 280,000 records into MySQL (SQLite is also fine). I only have the xlsx format and couldn't convert it to an xls file. Is there an import option in the latest version, or any better tool available? Thanks in advance.
See if you can convert the xlsx file to CSV format, then follow this post: Import Excel Data into MySQL in 5 Easy Steps
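If you have Python available, the conversion itself is a one-liner with pandas (file names are placeholders):
import pandas as pd
pd.read_excel('input.xlsx').to_csv('output.csv', index=False)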
You can use the Data Import tool in dbForge Studio for MySQL.
Open the Data Import wizard, select the 'MS Excel 2007' format, specify the other options, and press Import.