Rename compound names in bulk in SQL (MySQL)

I retrieved data containing chemical compounds and related information in CSV format. I am using SQL (MySQL) for sorting and retrieving the data that I want, and I am very new to this.
The issue I am facing is that the compounds are named compound_1, compound_2, and so on, and I want to rename them to their respective compound names (e.g. compound_1 is Nicotine, compound_2 is Aspirin).
There are over 5,500 of these, so I'd like to know if it's possible for me to replace the names in bulk.

You can create a table with the columns
required to store the chemical compounds and related data,
and import the CSV file into this table.
Then create another CSV file mapping the compound IDs to their names (Aspirin, ...).
Import that CSV file into a second table; joining on the compound ID will let you replace all of compound_1, compound_2, ... with their names.
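If you'd rather do the replacement outside the database, the same idea can be sketched in Python with the standard csv module. The file names (data.csv, mapping.csv) and column headers here are assumptions, and the demo data written at the top only stands in for your real files:

```python
import csv

# Demo data: in practice these would be your exported CSV and the
# mapping file you prepare (placeholder -> real compound name).
with open("mapping.csv", "w", newline="") as f:
    f.write("placeholder,name\ncompound_1,Nicotine\ncompound_2,Aspirin\n")
with open("data.csv", "w", newline="") as f:
    f.write("compound,mass\ncompound_1,162.23\ncompound_2,180.16\n")

# Build a lookup table from the mapping file ...
with open("mapping.csv", newline="") as f:
    name_map = {r["placeholder"]: r["name"] for r in csv.DictReader(f)}

# ... and rewrite the data file, substituting real names where known.
with open("data.csv", newline="") as src, \
     open("data_renamed.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        row["compound"] = name_map.get(row["compound"], row["compound"])
        writer.writerow(row)
```

With 5,500 placeholders the dictionary lookup handles the whole replacement in one pass, which is the point of building a mapping file first.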

Related

How can I do an update import in Quick Base from a csv without importing null values?

I need a way to update a Quick Base table with a CSV update import, but I don't want to overwrite existing data in the Quick Base table with null values from the CSV. I want to import only the non-null data from the CSV.
I would like to do regular updates of a Quick Base table by uploading a CSV. However, my CSV will only include values for data that is changing from the existing records. Most of the values in the CSV will be null.
How are you uploading the csv?
The most important thing to know is that you need to provide record IDs for each item in your csv to match them with existing records, otherwise Quickbase writes new records. You also need to format your clist properly. So...
You can narrow down what will be written by including Record IDs in your csv rows. If your csv contains Record IDs then only the matching Record IDs will be updated. You can optionally use mergefield instead of record ID. This is a new feature as of a few months ago.
If you need to update individual fields per record and exclude other fields, then specify only the fields that should get updated in the clist, ex: clist='3.7.11.31' (include record id so that it can match the existing records).
If you will have a mix of records and fields where some are supposed to be updated and some aren't, then you could preprocess the CSV with something like JavaScript. For example: do an API_DoQuery with a clist that matches your CSV's clist (or 'a' for all). Let the results of that API call be your base CSV, then overlay the values from the CSV you want to upload onto it. That way you'll have a copy of the data from Quickbase with your updated values merged in, and you can run API_ImportFromCSV on the result.
See the documentation on the API_ImportFromCSV call for more help.
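The merge step described above (export, overlay non-null values, re-import) can be sketched in Python rather than JavaScript. The file names and the "rid" column name are assumptions standing in for your export and sparse update file; in Quickbase itself the Record ID# is field id 3, as in the clist example above:

```python
import csv

# Demo files: "export.csv" stands in for the result of API_DoQuery,
# "updates.csv" for the sparse update file with mostly null values.
with open("export.csv", "w", newline="") as f:
    f.write("rid,name,status\n1,Alpha,open\n2,Beta,open\n")
with open("updates.csv", "w", newline="") as f:
    f.write("rid,name,status\n2,,closed\n")

# Index the full export by Record ID ...
with open("export.csv", newline="") as f:
    merged = {r["rid"]: r for r in csv.DictReader(f)}

# ... and overlay only the non-empty values from the update file,
# so nulls in the update never overwrite existing data.
with open("updates.csv", newline="") as f:
    for row in csv.DictReader(f):
        base = merged[row["rid"]]
        for field, value in row.items():
            if value != "":
                base[field] = value

with open("merged.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["rid", "name", "status"])
    writer.writeheader()
    writer.writerows(merged.values())
```

The merged file is then safe to upload, since every record carries its full current values plus your changes.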

Add names to tables in Tableau

I'm new to Tableau and have a CSV file without column names. All the data is stored in this CSV file. There is another CSV file that contains all the column names. I am trying to add these names to the unnamed data file. Is there a way to do it?
I know I could open the CSV file and copy the name manually, but I wish I could operate it in the Tableau.
Thanks.
You can connect to a CSV file that does not have a header row, and then name the fields yourself in Tableau when editing the data connection.
Click on the settings (gear shaped) icon on the right side of the table in the data source pane.
Then specify that there is no header row and that Tableau should generate names for the fields.
Then rename the fields from their generated names like F1, F2, etc. to something meaningful, by clicking on the little black triangle icon next to the generated field names and choosing Rename.

Extracting column names from several CSV files programming

I have 40 CSV files. All the files have different column names. I want a list of the column names of each CSV in table format (in CSV or in Excel). The new file should contain the list of column names from each CSV file and the corresponding file name.
I am doing it manually for now, but if the number of files increases it will become problematic, so I want to do it with code.
Note: this may be a very trivial thing, but I am not a techie, so please help.
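A minimal Python sketch of this, assuming all the files sit in one folder and the first row of each file is its header; the two demo files created at the top only stand in for your real 40:

```python
import csv
import glob
import os

# Demo input files; in practice, point the glob at your folder of CSVs.
with open("a.csv", "w", newline="") as f:
    f.write("id,name\n1,x\n")
with open("b.csv", "w", newline="") as f:
    f.write("date,value,unit\n2020,3,kg\n")

# Write one summary row per (file, column name) pair.
with open("columns_summary.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["file", "column"])
    for path in sorted(glob.glob("*.csv")):
        if path == "columns_summary.csv":   # don't list the output file
            continue
        with open(path, newline="") as f:
            header = next(csv.reader(f), [])  # first row = column names
        for col in header:
            writer.writerow([os.path.basename(path), col])
```

The resulting columns_summary.csv opens directly in Excel, and the script scales to any number of files since nothing is done by hand.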

Importing excel file to access and set up columns field name

I have an Access tool that imports an Excel file with table information. The system creates a new table from this info with column fields (F1, F2, F3, etc.); under them there are 10 lines of data and after that a table. I need the information from this table to be appended to another table in Access. I have the code and the append query, but sometimes some of the columns in the Excel file change places, and this is a problem for my second table. Is it possible to automatically rename the column fields in the first table when I import the info from the Excel sheet?
Thank you in advance! Here is a screenshot: the yellow table should go into the grey one.
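If pre-processing the sheet before the Access import is an option, one workaround (sketched here in Python, outside Access) is to normalize the column order by header name, so the append query always sees the same layout. The canonical column order and file names are assumptions for illustration:

```python
import csv

# The hypothetical order your Access append query expects.
canonical = ["id", "name", "qty"]

# Demo input whose columns arrive in a different order.
with open("sheet.csv", "w", newline="") as f:
    f.write("qty,id,name\n5,1,widget\n")

# Rewrite the file with the columns in the canonical order,
# matching by header name rather than by position.
with open("sheet.csv", newline="") as src, \
     open("sheet_fixed.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=canonical)
    writer.writeheader()
    for row in reader:
        writer.writerow({k: row.get(k, "") for k in canonical})
```

Because the matching is by name, it no longer matters where a column sits in any given export; missing columns simply come through empty.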

ACCESS: Truncation error when appending CSV data to tables?

I am currently experiencing difficulties when trying to append data to existing tables.
I have about 100 CSV files that I would like to create a single table from; all the tables have different column structures but this isn't really an issue as the associated field names are in the first row of each file.
First, I create a new table from one of the files indicating that my field names are in the first row. I change the particular fields that have more than 256 characters to memo fields and import the data.
I then add to the table the fields that are missing.
Now, when I try to append more data, I again select that my field names are in the first row, but now I receive a truncation error for data that is destined for the memo fields.
Why is this error occurring? Is there a workaround for this?
edit
Here is an update regarding what I've attempted to solve the problem:
Importing and appending tables will not work unless they have the exact same structure. Moreover, you cannot create a Master table with all fields and properties set, then append all tables to the master. You still receive truncation errors.
I took CodeSlave's advice and attempted to upload the table, set the fields that I needed to be Memo fields, and then append the table. This worked, but again, the memo fields are not necessarily in the same order in every data file, and I have 1200 data files to import into 24 tables. Importing the data table by table is just NOT an option for this many tables.
I expect what you are experiencing is a mismatch between the source file (CSV) and the destination table (MS Access).
MS Access will make some guesses about what the field types are in your CSV file when you are doing the import. However, it's not perfect. Maybe it's seeing a string as a memo, or a float as a real. It's impossible for me to know without seeing the data.
What I would normally do, is:
1. Import the second CSV into its own (temporary) table.
2. Clean up the second table.
3. Use an SQL query to append the records from the second table to the first table.
4. Delete the second table.
(Repeat for each CSV file you are loading.)
If I knew ahead of time that every CSV file was already identical in structure, I'd be inclined to instead concatenate them all together into one, and only have to do the import/clean-up once.
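The staging-table pattern above can be sketched with Python's sqlite3 standing in for Access, so this is an illustration of the workflow rather than Access code:

```python
import sqlite3

# Staging-table pattern: load each CSV into a temporary table, clean it,
# append to the master table, then drop the temporary table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE master (name TEXT, notes TEXT)")

# Stand-in for the rows of one imported CSV file; the long second
# field mimics memo-length (>255 char) data.
staged_rows = [("alpha", "short note"), ("beta", "x" * 300)]
con.execute("CREATE TABLE staging (name TEXT, notes TEXT)")
con.executemany("INSERT INTO staging VALUES (?, ?)", staged_rows)

# A clean-up step would go here (trim fields, fix types, ...).

# Append the staged records to the master table, then drop staging.
con.execute("INSERT INTO master (name, notes) "
            "SELECT name, notes FROM staging")
con.execute("DROP TABLE staging")
con.commit()
```

In Access the append step would be an ordinary append query; the point is that type clean-up happens in the staging table, before the data ever touches the master.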
I had a very similar problem: trying to import a CSV file with large text fields (>255 chars) into an existing table. I declared the fields as memo, but they were still being truncated.
Solution: start an import to link a table, then click the Advanced button. Create a link specification which defines the relevant fields as memo fields, and save the link specification. Then cancel the import. Do another import, this time the one you want, appending to the existing table. Click the Advanced button again and select the link specification you just created. Click Finish, and the data should be imported correctly without truncation.
I was having this problem too, but noticed it always happened only to the first row. By inserting a blank row in the CSV it imported perfectly; you then need to remove the blank row from the Access table.
Cheers,
Grae Hunter
Note: I'm using Office 2010