MySQL for Excel - imported column order error

I'm using the MySQL for Excel plug-in (version 1.3.7) to import data from my MySQL database into Excel. However, Excel changes only the order of the columns (to alphabetical order) while the data remain in the original order.
The data rows appear in the order I want, but the header row is wrong!
For example:
If my table is (in MySQL Workbench):
id | Name | Date_of_birth
01 | Test1 | 01/01/2001
02 | Test2 | 02/02/2002
Excel tries to import it as:
Date_of_birth | id | Name ---> (ALPHABETICAL ORDER)
01 | Test1 | 01/01/2001
02 | Test2 | 02/02/2002
Because the "Name" column is a varchar(100), it does not accept the DATE values that end up beneath it.
So I cannot import anything into Excel.
The only workaround I've found is to put my table's columns in alphabetical order (inside MySQL Workbench), but that is very inefficient and I don't want to do it for every table.
Could you help me?
Thank you very much for your attention.

If you are copying and pasting, try using the "Text to Columns" button in Excel under the Data tab.
Excel shouldn't be sorting these automatically. Start with a blank worksheet if you can and see whether you have the same problem.
Otherwise, please post how you are moving the data from Workbench to Excel; that is likely where the problem is.

I got stuck on this for a while; I am surprised I could not find more complaints about this issue.
Deselecting the MySQL add-in, restarting Excel, and then reselecting the add-in did the trick for me.
To reach the MySQL add-in:
File -> Options -> Add-ins -> Manage: COM Add-ins -> Go

Access 2016 prevent double loading of data

My Setup:
I have a decently large table where each record should be all sales for a specific store for that day.
For example the records look roughly like:
Location | Date | Sales | etc.
Store 1 | 1/29/2018 | $20 | etc.
Store 2 | 1/29/2018 | $5 | etc.
Store 1 | 1/30/2018 | $25 | etc.
Store 2 | 1/30/2018 | $10 | etc.
In short, you should NEVER have the same store on the same day more than once.
What's the best way to check this? Can I use data validation on my records (I'm assuming not, because my understanding is that it won't check against the already-loaded data), or do I need to write something in VBA? (I'm currently using canned saved imports, but I can write something if I must.)
I have an automated daily append to the table, but occasionally things get messed up, and manually stripping out a day's worth of duplicate data is obviously not ideal.
My original answer was:
Access can help you detect those duplicate stores and days easily with the query assistant. Just design a "find duplicates" query (Access's Find Duplicates Query Wizard), using as criteria the fields you don't want repeated (in your question, I understand those fields are Location and Date).
OP tried and said:
Yeah, it works. It's really just easier to handle by importing into a temp table and then using a query to check for duplicates before loading, as opposed to arcane data validation rules.
So the OP resolved the problem by importing the data into a temp table and running the "check for duplicates" query on it before loading the data into the non-temp tables.
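For reference, the kind of query the wizard produces is just a GROUP BY with a HAVING clause; a minimal sketch, assuming the staging table is named tblSalesTemp (the table name is illustrative, the columns follow the example above):
SELECT Location, [Date], COUNT(*) AS DupCount
FROM tblSalesTemp
GROUP BY Location, [Date]
HAVING COUNT(*) > 1;
Any rows this returns are the store/day combinations that would violate the rule, so they can be fixed in the temp table before appending.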

Implementing condition in SSIS

I'm importing data from a txt file into a SQL Server table. That part works well.
Every day this txt file is deleted and a new one is created (i.e. yesterday it held data for 3 February, today for 4 February, in the Date column).
When I run the package, I want it to check whether that Date value already exists in the database table. If it exists, skip the import; if it doesn't, import the data. I also want to save that Date value in a variable for further manipulation. How can I accomplish that?
Suppose you have a source file with the format and data below:
id | product | dateLoad
1 | dell | 25-01-2016 16:23:14
2 | hp | 25-01-2016 16:23:15
3 | lenovo | 25-01-2016 16:23:16
and your destination table has the format below:
create table stack(id int,product varchar(20),dateLoad smalldatetime);
In your SSIS package, add a Derived Column transformation to convert the smalldatetime to a date.
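The exact expression depends on your column names, but assuming the dateLoad column from the example, the derived column would apply a cast along the lines of:
(DT_DBDATE)dateLoad
(DT_DBDATE is the SSIS expression type for a date without a time component.)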
Secondly, add a Lookup transformation. In the General tab of the Lookup Transformation Editor, go to "Specify how to handle rows with no matching entries" and select "Redirect rows to no match output". In the Connection tab, add a connection to the target table and use a SQL query that converts the smalldatetime to a date.
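As a sketch, a lookup query of that shape, assuming the stack table defined above, could be:
SELECT DISTINCT CAST(dateLoad AS date) AS dateLoad
FROM stack;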
In the Columns tab, map the converted date column from the data flow to the date column returned by the lookup query.
Finally, connect the Lookup to your target table and select the "Lookup No Match Output".
On the first execution, 3 rows were inserted, because the dates were not yet in my table.
When I executed the package again, 0 rows were inserted, because the dates were already in my table.
I hope that helps.

Exporting Data from Cassandra to CSV file

Table Name : Product
uid | productcount | term | timestamp
304ad5ac-4b6d-4025-b4ea-8b7991a3fe72 | 26 | dress | 1433110980000
6097e226-35b5-4f71-b158-a1fe39a430c1 | 0 | #751104 | 1433861040000
Command :
COPY product (uid, productcount, term, timestamp) TO 'temp.csv';
Error:
Improper COPY command.
Am I missing something?
The syntax of your original COPY command is otherwise fine. The problem is with your column named timestamp, which is a data type name and a reserved word in this context. For this reason you need to escape the column name as follows:
COPY product (uid, productcount, term, "timestamp") TO 'temp.csv';
Better yet, consider using a different column name, because this can cause other problems as well.
I was able to export the data to a CSV file by using the command below. Omitting the column names did the trick:
copy product to 'temp.csv' ;
Use the following commands to get data from Cassandra tables into a CSV file.
This command copies only the first page of rows (100 by default) to the CSV file:
cqlsh -e"SELECT * FROM employee.employee_details" > /home/hadoop/final_Employee.csv
This command copies all rows to the CSV file:
cqlsh -e"PAGING OFF;SELECT * FROM employee.employee_details" > /home/hadoop/final_Employee.csv

MySQL varchar column filled but not visible

I'm having a problem with a column (VARCHAR(513) NOT NULL) on a MySQL table. During an import from a CSV file, a bunch of rows got filled with some weird stuff coming from I don't know where.
This stuff is not visible from Workbench, but if I query the DBMS with:
SELECT * FROM MyTable;
I get:
ID | Drive | Directory | URI | Type
1 | Z: | \Users\Data\ | \server\dati | 1 <- correct row
...
32 | NULL | \Users\OtherDir\ | | 0
While row 1 is correct, row 32 shows a URI filled with something. Now, if I query the DBMS with:
SELECT length(URI) FROM MyTable WHERE ID = 32;
I get 32. However, doing:
SELECT URI FROM MyTable WHERE ID = 32;
inside an MFC application returns a string with length 0.
Inside this program I have a tool for handling this table, but it cannot work because I cannot build queries that match the rows with the bugged URI. How can I fix this? Where does this problem come from? If you need more information, please ask.
Thanks.
It looks like you have whitespace in the data, which is causing the issue; this often happens when importing data from a CSV file.
To fix it, run the following update statement:
update MyTable set URI = trim(URI);
The above removes leading and trailing spaces from the column.
Also, when importing data from CSV, it's better to apply TRIM() to the values before inserting them into the database, which avoids this kind of issue.
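Note that MySQL's TRIM() only strips spaces by default. If the invisible characters are carriage returns or line feeds (a common artifact of CSV files created on Windows), a sketch of a stronger cleanup for the same column:
update MyTable set URI = trim(replace(replace(URI, '\r', ''), '\n', ''));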

Selecting the most popular keyword in a mysql database

I have a column called keywords where users enter up to 4 keywords separated by commas, e.g.:
----------------------------------
userId | keywords                 |
----------------------------------
01     | php,css,html,mysql       |
02     | wordpress,css,drupal,xx  |
03     | mysql,html,wordpress,css |
----------------------------------
I'm trying to figure out a query that selects everyone's keywords, splits them on the commas, and then counts how many occurrences there are of each.
I know I can do this quite easily with PHP, but I thought there might be a way for MySQL to do it...
Any ideas?
Try to normalize the data, i.e. store up to 4 rows (one per keyword) for each user instead of one.
It is also possible to split a string into a temporary table, but I'm not sure that will help you much. I originally found this on MySQL Forge, but that has been shut down, so here is similar code:
http://www.pnplogic.com/blog/articles/MySQL_Convert_Delimited_String_To_Temp_Table_Result_Set.php
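For reference, since each row holds at most 4 comma-separated keywords, the split-and-count can also be done in plain MySQL with SUBSTRING_INDEX and a small derived table of index numbers; a minimal sketch, assuming the table is named MyTable with the userId and keywords columns shown above:
SELECT SUBSTRING_INDEX(SUBSTRING_INDEX(t.keywords, ',', n.n), ',', -1) AS keyword,
       COUNT(*) AS cnt
FROM MyTable t
JOIN (SELECT 1 AS n UNION SELECT 2 UNION SELECT 3 UNION SELECT 4) n
  ON n.n <= 1 + LENGTH(t.keywords) - LENGTH(REPLACE(t.keywords, ',', ''))
GROUP BY keyword
ORDER BY cnt DESC;
With the sample data above this returns css = 3, then mysql, html, and wordpress with 2 each, and 1 for the rest.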