MySQL Query or Excel 2010 - Which is the better way of arranging data for reports? - mysql

Info: Server version: 5.1.39 - MySQL / phpMyAdmin
Php: 5.4
Server: Apache
Code is run via: SQL queries pasted into phpMyAdmin, in MySQL Workbench, or through a custom shopping-cart manager.
Exports to: Excel 2010 (.csv then to .xlsx for sales reports)
Hi there,
This is more of a personal question than a technical one; however, the answer should have a technical component.
Which is better? Run a full query (with calculations) in MySQL, export to Excel, and work on that data, OR run a basic query and then do the calculations in Excel?
The scenario:
My sales reports come from the MySQL database of my online store; each month I export the data that I need and create reports in Excel with additional information (info not in the store).
My query has calculations, both for profit/loss and for dates. I can do this in Excel as well.
I am not an expert in either field, barely scraping by with help from people such as the community here and other sites, however I'd like to read some feedback from those that work with both.
I currently use things like (in addition to column calculations):
DATE_FORMAT(T5.date_purchased, '%Y-%m-%b') AS OrdMonth,
DATE_FORMAT(T5.date_purchased + INTERVAL 1 MONTH, '%Y-%m-%b') AS PayMonth,
CONCAT(DATE_FORMAT(T5.date_purchased + INTERVAL 1 MONTH, '%y%m'), '-', T4.manufacturers_id) AS RepID,
DATE_FORMAT(T5.date_purchased + INTERVAL 1 MONTH, '%y%m') AS BatchID,
Which I can do in Excel with similar formulas.
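As a sanity check on the "do it after export" option, the same derived columns can be reproduced outside MySQL. Below is a minimal Python sketch (not from the post; function names and the input format are assumptions) that computes OrdMonth, PayMonth, RepID and BatchID from an exported `date_purchased` value:

```python
import calendar
from datetime import date

def add_month(d):
    """d + INTERVAL 1 MONTH, clamping the day like MySQL does (Jan 31 -> Feb 28)."""
    total = d.year * 12 + (d.month - 1) + 1
    y, m = divmod(total, 12)
    day = min(d.day, calendar.monthrange(y, m + 1)[1])
    return date(y, m + 1, day)

def report_fields(date_purchased, manufacturers_id):
    """Reproduce the four derived columns from the query above.
    Assumes date_purchased is exported as 'YYYY-MM-DD hh:mm:ss'.
    Note: %b is locale-dependent in Python, as in MySQL."""
    d = date.fromisoformat(date_purchased[:10])
    nxt = add_month(d)
    ord_month = d.strftime('%Y-%m-%b')        # DATE_FORMAT(..., '%Y-%m-%b')
    pay_month = nxt.strftime('%Y-%m-%b')
    batch_id = nxt.strftime('%y%m')           # DATE_FORMAT(..., '%y%m')
    rep_id = f'{batch_id}-{manufacturers_id}' # CONCAT(batch, '-', manufacturers_id)
    return ord_month, pay_month, rep_id, batch_id
```

For example, `report_fields('2014-01-15 10:00:00', 7)` yields `('2014-01-Jan', '2014-02-Feb', '1402-7', '1402')`, matching what the SQL expressions would produce.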
So, is it better to have the report generated by the server exactly as I want it in Excel, or to just get the barest of info and run the calculations afterwards?
Thank you in advance for your knowledge & input.

If the target user is any typical viewer of a web page, then use PHP+SQL.
If the target user is you, then get the data out as quickly as you can, then massage it with Excel.
Excel is more flexible, and functions + VBA will take you farther and faster than SQL.
In order to go more in depth with the answer I need more details.
EDIT
Here is the solution that I use very often: it is not professional, but when it's for personal use it's the fastest way:
Open both the CSV file with the data to analyze and the Excel file with a VBA macro, called CreateReport.xlsm
Run the macro
Here is the description of the steps executed by the macro:
Scan all the open workbooks looking for the csv file (recognizing it for example by some header that is always there)
Add columns with formulas
For each report:
Duplicate the sheet
Delete the rows or columns that you don't need
Apply the formatting
If you don't know VBA there are many resources out there to get started. You can come back here when you have more specific questions.
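If VBA isn't an option, the same macro steps (read the data file, add computed columns, produce one filtered copy per report) can be sketched in Python. This is only an illustration: the file name, the `price`/`qty` columns and the computed `total` column are hypothetical, not from the answer above.

```python
import csv

def build_reports(src_path, reports):
    """Mirror the macro: read the CSV once, add a computed column,
    then write one filtered copy per report definition."""
    with open(src_path, newline='') as f:
        rows = list(csv.DictReader(f))
    for row in rows:                           # "add columns with formulas"
        row['total'] = float(row['price']) * int(row['qty'])
    for name, keep in reports.items():         # "for each report: duplicate, filter"
        subset = [r for r in rows if keep(r)]
        with open(f'{name}.csv', 'w', newline='') as out:
            w = csv.DictWriter(out, fieldnames=list(rows[0].keys()))
            w.writeheader()
            w.writerows(subset)
```

Usage would look like `build_reports('sales.csv', {'big_orders': lambda r: r['total'] > 10})`, producing one output CSV per report.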

Related

Google Data Studio report using Cloud SQL MySQL allows only one table

I'm using Data Studio to generate a financial report dashboard and I'm connecting it to Cloud SQL MySQL, but my problem is that it only lets me select one table to use as a data source, and one table alone won't let me generate a financial report.
Here's the image of the process of selecting a Data Source:
I tried selecting Custom Query, which according to this: https://support.google.com/datastudio/answer/7088031?hl=en
Select the CUSTOM QUERY option to provide a SQL query instead of connecting to a single table. Google Data Studio uses this custom SQL as an inner select statement for each generated query to the database.
But I don't know what query I should write to have all my database tables available as data sources in Google Data Studio.
Regarding Custom Queries: I had a look online and didn't find a sample CUSTOM QUERY specific to Google Data Studio and Google Cloud SQL for MySQL; however, there are a couple of Stack Overflow posts on BigQuery Custom Queries involving joins that may be useful:
Data Studio query error when using Big Query view that joins tables
BigQuery Data Studio Custom Query
An alternative is to create individual Data Sources each linked to a single table and then link multiple Data Sources through the use of Data Blending, where a common Join Key links all the respective Tables.
In addition, if you could elaborate on your exact scenario, it would perhaps help users familiar with SQL to provide a more precise solution:
How are the tables structured?
How are the tables linked?
What code have you currently tried?
I also had quite a few issues with the Custom Query using the Cloud MySQL Connector by Google for Data Studio.
The resolution for me was to not run SELECT * but rather SELECT each column by name. Not sure why it doesn't like SELECT * but hopefully this helps someone else.
Example of a successful query.
Example of a successful query with a join.

SSDT - columns out of synchronization - blocks update to table

Process - first off, I would like to just use a program to enter data into the SQL table, but due to certain ... issues ... this is not an option.
The Excel sheet has set column headers - they do not change - but the Excel data in those columns does change: new records are added and others are removed daily. I have a project in SSDT 2015:
Clear temp table - delete from tbldatemp
Data flow task
a. Source: Excel sheet - access mode: table or view
b. Data conversion 0-0 - some conversions necessary for SQL to accept the data
c. Destination: tbldatemp
Update main tblda - insert into where not exists a.[blah] = b.[blah], etc.
So every day the data in Excel changes. I have a VB.NET program that executes the update package with the click of a button - and every day it doesn't work. So I open the project in SSDT and see a nice yellow ! triangle on my data flow task stating that the columns are out of synchronization. When I open the data flow task, the ! is now on the Excel source, stating that the columns are out of synchronization.
I have looked for DAYS trying to figure out how to fix this or get around it and cannot figure it out. Please help! Thank you

How can I copy-and-paste Data into MySQL Workbench table?

I've used MySQL Workbench and MS SQL Server Management Studio off and on over the years. The one thing I enjoy with SSMS is the ability to copy information from an Excel document and paste it into the results pane of SSMS. Is there a similar means of importing information into tables for MySQL Workbench? This method is quick and easy. Right now, the way I do it for MySQL is to export from Excel to a CSV file, then import to MySQL from the CSV file. Thanks!
There's no way to insert an entire Excel table into a result set in MySQL Workbench using only the clipboard. For this task your approach via a CSV file is the best one. There's an option to copy/paste entire rows, though, which might be of help for small changes.
Third Party Tool with CSV
If you are willing to use a third-party tool, you can paste a selection copied from an Excel worksheet, or a CSV open in a text editor, into the tool to generate INSERT INTO or UPDATE statements. It can also add the CREATE TABLE with appropriately sized VARCHARs as an option (among many other output options). You copy and paste from the tool directly into a blank SQL page in MySQL Workbench.
Third Party Tool with Regex
Alternatively, you can use a regular expression tool such as https://regex101.com to generate the SQL code. For example, a three-field, tab-separated CSV can be decoded by this regex
^([^\t]*)\t([^\t]*)\t([^\t]*)$
and then in the Substitution expression, this
INSERT INTO `myschema`.`mytable` SET `first_field`='$1', `second_field`='$2', `third_field`= TRIM('$3'), `outbox_sent_text` = 'Friday, December 04, 2020 12:28 PM', `inbox_received_text` = 'Fri, 4 Dec 2020 15:43:08 +0000' AS `NEW` ON DUPLICATE KEY UPDATE id=id;
Note this substitution uses the newer INSERT ... AS `NEW` ON DUPLICATE KEY UPDATE row-alias syntax, which replaces the deprecated VALUES() syntax
Copy the block of INSERTs into an empty Workbench SQL window, and run the query.
Third party not OK? Not a problem, works locally too
If sending this through a web site is not allowable, the same regex substitution can be done with a local script or local text editing tool (Notepad++, etc.).
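For example, a short Python script can apply the same regex locally. This is a sketch using the pattern above; the generated statement is simplified to the three captured fields (without the extra literal columns), and values are not SQL-escaped, so it is only suitable for trusted data:

```python
import re

# The regex from the answer above; each group is one tab-separated field.
LINE = re.compile(r'^([^\t]*)\t([^\t]*)\t([^\t]*)$', re.MULTILINE)

# Same shape as the substitution above, minus the extra literal columns.
TEMPLATE = (r"INSERT INTO `myschema`.`mytable` "
            r"SET `first_field`='\1', `second_field`='\2', `third_field`=TRIM('\3') "
            r"AS `NEW` ON DUPLICATE KEY UPDATE id=id;")

def to_sql(tsv_text):
    """Turn each tab-separated line into one INSERT statement."""
    return LINE.sub(TEMPLATE, tsv_text)
```

Running `to_sql` over the pasted TSV produces one INSERT per line, ready to paste into a Workbench SQL window.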

Import Excel to SQL Server 2008

I need to create a process to import a multi-tabbed Excel spreadsheet into SQL Server 2008 R2. Each tab will be a different table in the database. This will need to be done weekly and imports should be automated. Ideally I want to pop the spreadsheet into a folder [or have some intern do it] and have SQL Server run a procedure that looks in this folder and adds the data to the tables in this DB. I would also like to have another table that tracks the imports and date-stamps them. I really have no idea where to even start here as I'm a pretty huge noob when it comes to T-SQL.
There is a nice article by Microsoft - http://support.microsoft.com/kb/321686 - that outlines the processes involved.
The process is simply
SELECT * INTO XLImport3 FROM OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0',
'Data Source=C:\test\xltest.xls;Extended Properties=Excel 8.0')...[Customers$]
Where XLImport3 is the table you want to import into and the datasource is the excel sheet you want to import from.
If you're limited solely to TSQL, the above two answers will show you some ideas. If you have access to either Data Tools or Business Intelligence, with SSIS, you can automate it with the assumption that each sheet in the Excel workbook matches each time. With SSIS, you'll use a Data Flow task and each sheet will be imported into the table that you want. When you're ready for the file the next week, you'll drop it into the folder and run the SSIS package.
However, if the sheet names change, (for instance, one week sheets are called Cats, Dogs, Rain and the next week it's Sulfur, Fire, Hell) then this would cause the package to break. Otherwise, if only the data within the worksheet change, then this can be completely automated with SSIS.
Example article: https://www.simple-talk.com/sql/ssis/moving-data-from-excel-to-sql-server---10-steps-to-follow/
Below is the code to insert data from a CSV file into a given table. I don't know what the full requirements are for the project, but if I were you I would just separate each table into a different file and then run a proc that inserts data into each of the tables.
BULK INSERT TABLE_NAME
FROM 'c:\filename.csv'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)

INSERT INTO import_history (filename, import_date) VALUES ('your_file_name', GETDATE())
Also, for the table that tracks imports and timestamps them, you could just insert some data into that table after each bulk insert as seen above.
Also, here's a link to tutorial on bulk inserting from a csv file that may also help: http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
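The bulk-load-plus-history-logging pattern above can be sketched end to end with SQLite standing in for SQL Server (the `import_history` table is from the answer; the `sales` table, file path, and column layout are hypothetical):

```python
import csv
import sqlite3
from datetime import datetime

def import_csv(conn, path, table):
    """Bulk-load a CSV into `table`, then record the import in
    import_history, mirroring the BULK INSERT + history INSERT pair."""
    with open(path, newline='') as f:
        reader = csv.reader(f)
        header = next(reader)                      # skip the header row
        placeholders = ','.join('?' * len(header)) # one ? per column
        conn.executemany(f'INSERT INTO {table} VALUES ({placeholders})', reader)
    conn.execute(
        'INSERT INTO import_history (filename, import_date) VALUES (?, ?)',
        (path, datetime.now().isoformat()))
    conn.commit()
```

Calling this once per file (one file per table, as suggested above) gives both the data load and the date-stamped audit trail the question asks for.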
It's very simple. Right-click the database in SQL Server (2008), select Tasks and select Import Data.
Now change the Data Source to Microsoft Excel. Choose the path of the Excel file by clicking the Browse button and click Next.
Choose the SQL Server instance and choose the database to which the Excel data is to be imported.
Select Copy data from one or more tables or views and click Next.
Now select the sheets to be imported to SQL Server.
Click Next.
Now click Finish.
The wizard now imports the data from Excel to SQL Server; click Close.
Here is the table

Query MySQL data from Excel (or vice-versa)

I'm trying to automate a tedious problem. I get large Excel (.xls or .csv, whatever's more convenient) files with lists of people. I want to compare these against my MySQL database.*
At the moment I'm exporting MySQL tables and reading them from an Excel spreadsheet. At that point it's not difficult to use =LOOKUP() and such commands to do the work I need, and of course the various text processing I need to do is easy enough to do in Excel.
But I can't help but think that this is more work than it needs to be. Is there some way to get at the MySQL data directly from Excel? Alternately, is there a way I could access a reasonably large (~10k records) csv file in a sql script?
This seems to be rather basic, but I haven't managed to make it work so far. I found an ODBC connection for MySQL but that doesn't seem to do what I need.
In particular, I'm testing whether the name matches or whether any of four email addresses match. I also return information on what matched for the benefit of the next person to use the data, something like "Name 'Bob Smith' not found, but 'Robert Smith' matches on email address robert.smith@foo".
You can use ADO and SQL. This example is an insert query, but any query will work:
Excel VBA: writing to mysql database
Why don't you load your CSV data into a dedicated table and perform your searches using MySQL's functions? You could even do the logic from within Excel (VBA or .NET, depending on the release).
No matter what you do, you will have to write a bunch of code if you want to detect Robert Smith...
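A sketch of that "dedicated table" approach, with SQLite standing in for MySQL (the table, column names, and sample rows are all hypothetical): load the spreadsheet rows into a table, then match on an exact name or on any of the four email columns, reporting which criterion hit.

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.executescript("""
    CREATE TABLE people (name TEXT, email1 TEXT, email2 TEXT,
                         email3 TEXT, email4 TEXT);
    INSERT INTO people VALUES
        ('Robert Smith', 'robert.smith@foo', '', '', ''),
        ('Jane Doe', 'jane@bar', '', '', '');
""")

def find_match(name, email):
    """Return (matched_name, reason) or None, mirroring the exact-name /
    any-of-four-emails test described in the question."""
    row = conn.execute(
        """SELECT name FROM people
           WHERE name = ? OR ? IN (email1, email2, email3, email4)""",
        (name, email)).fetchone()
    if row is None:
        return None
    reason = 'name' if row[0] == name else 'email'
    return row[0], reason
```

For example, `find_match('Bob Smith', 'robert.smith@foo')` returns `('Robert Smith', 'email')` — exactly the "not found by name, but matched on email" report the question wants to produce. Fuzzy matching ('Bob' vs 'Robert') would still need extra code, as the answer above notes.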