MySQL query using Excel - mysql

I am writing a MySQL query in VBA to extract data from my database.
Is it possible to pass a list of criteria to a MySQL statement from Excel?
For example, let's say my script (using VBA) is as follows:
select * from farms where date = '20180101' and
(animals like 'cat'
or animals like 'dog'
or animals like 'horse')
Now let's say I have 200 animals and they are in an Excel list or a spreadsheet. Is it possible to reference the list (like a lookup) instead of writing a very lengthy script?
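One common approach is to read the values from the worksheet range and join them into a single IN (...) clause (or a chain of LIKE conditions) programmatically. Below is a minimal sketch of the string-building in Python; the `animals` list and `build_query` function are hypothetical stand-ins, and in VBA the same idea applies by looping over the range and concatenating the clause.

```python
# Sketch: build the WHERE clause from a list of animal names instead of
# writing 200 OR conditions by hand. The animals list here is hypothetical;
# in practice it would be read from the Excel range.

animals = ["cat", "dog", "horse"]

def build_query(animal_list):
    """Return the SELECT with an IN (...) clause built from the list."""
    # NOTE: real code should escape/parameterize values to avoid SQL injection;
    # here single quotes are doubled as a minimal precaution.
    quoted = ", ".join("'{}'".format(a.replace("'", "''")) for a in animal_list)
    return ("select * from farms where date = '20180101' "
            "and animals in ({})".format(quoted))

print(build_query(animals))
```

Since the original query uses LIKE with no wildcards, IN is equivalent here; if wildcard matching is actually needed, join `animals like '...'` fragments with OR instead.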

Related

Pivot Table in Adwords Script Alternative

Manipulating a pivot table in a Google AdWords script is not yet supported. Is there any script or project alternative for this?
You can generate a new tab and insert into any cell a formula with a Google QUERY like this:
=QUERY(IMPORTRANGE("1fSdx7f3rg_Vp_11yFuqKZmHraqFit8A", "Headline1!A1:J"), "select Col1, sum(Col4), sum(Col5), sum(Col7), sum(Col8) group by Col1", 1)
where you specify the source of the data, the query, and a headers parameter.
Note that the columns should be named like this: Col1, Col2, etc., so the column order should be strict (if you want to generate a report regularly).
The Google Visualization API Query Language (used in the "query" parameter) is almost the same as SQL, but with some limitations on the commands that can be used.

Ms Access Data Macro Return Record Set

I am using a Data Macro for one of my Access tables' After Insert event. Inside this Data Macro, I am using SetLocalVar to call one of my functions written in VBA to insert the same inserted record into my SQL database. This is what the expression in SetLocalVar currently looks like:
AfterInsertMacro("TblLotStatus",[LotStatusID], [NextStatus], [Status], [Printed], [Active], [AvailableForPull] )
Now, instead of passing back every column as a separate variable, can't I return a recordset, a collection, an array, etc.?
Basically I am looking for something like this (written in C#):
AfterInsertMacro("TblLotStatus", new object[]{[LotStatusID], [NextStatus], [Status], [Printed], [Active], [AvailableForPull]} )
Or any kind of expression that would look like:
AfterInsertMacro("TblLotStatus", [Inserted Recordset])

Using Google BigQuery to run multiple queries back to back

I'm currently working on a project where I'm using Google BigQuery to pull data from spreadsheets. I'm VERY new to SQL, so I apologize. I'm currently using the following code:
Select *
From my_data
Where T1 > 1000
And T2 > 2000
So, keeping the Select and From the same, I want to be able to run multiple queries where I just keep changing the values I'm looking for in T1 and T2 — around 50 different value pairs. I'd like for BigQuery to run through these 50 different values back to back. Is there a way to do this? Thanks!
I'm VERY new to SQL
... and I assume new to BigQuery as well ..., so
below is one of the options for new users who are not yet familiar with the BigQuery API and/or clients other than the BigQuery Web UI.
BigQuery Mate adds a parameters feature to the BigQuery Web UI.
What you need to do is:
Save your query using the Save Query button. Note <var_t1> and <var_t2>: those are the parameters identifiable by BigQuery Mate.
Click QB Mate and then Parameters, and set the parameters to whatever values you want to run with.
Click the Replace Parameters OK button; those values will appear in the editor, and you can now run your query.
To run another round with new parameters, load your saved query into the editor again by clicking the Edit Query button, then repeat setting the parameters, and so on.
You can find the BigQuery Mate Chrome extension here.
Disclaimer: I am the author and the only developer of this tool.
You may be interested in running parameterized queries. The idea would be to have a single query string, e.g.:
SELECT *
FROM YourTable
WHERE t1 > @t1_min AND
      t2 > @t2_min;
You would execute this multiple times, where each time you bind different values of the t1_min and t2_min parameters. The exact logic would depend on the API through which you are using the client libraries, and there are language-specific examples in the first link that I provided.
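The bind-and-loop structure can be sketched in Python as below. The threshold values and the `build_param_sets` helper are hypothetical; the actual execution call (shown in comments) would use the google-cloud-bigquery client library and requires a project and credentials, so it is not run here.

```python
# Sketch: run the same parameterized query for many (t1_min, t2_min) pairs.
# Only the loop/binding logic is executed; the client call is indicated in
# comments with the google-cloud-bigquery library in mind.

QUERY = """
SELECT *
FROM YourTable
WHERE t1 > @t1_min AND t2 > @t2_min
"""

# The ~50 threshold pairs you want to sweep (hypothetical values).
threshold_pairs = [(1000, 2000), (1500, 2500), (2000, 3000)]

def build_param_sets(pairs):
    """Turn (t1_min, t2_min) tuples into named-parameter dicts."""
    return [{"t1_min": t1, "t2_min": t2} for t1, t2 in pairs]

for params in build_param_sets(threshold_pairs):
    # With the official client this would be roughly:
    # job_config = bigquery.QueryJobConfig(query_parameters=[
    #     bigquery.ScalarQueryParameter("t1_min", "INT64", params["t1_min"]),
    #     bigquery.ScalarQueryParameter("t2_min", "INT64", params["t2_min"]),
    # ])
    # rows = client.query(QUERY, job_config=job_config).result()
    print("would run with t1_min={t1_min}, t2_min={t2_min}".format(**params))
```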
If you are not concerned about sql-injection and just want to iteratively swap out parameters in queries, you might want to look into the mustache templating language (available in R as 'whisker').
If you are using R, you can iterate/automate this type of query with the condusco R package. Here's a complete R script that will accomplish this kind of iterative query using both whisker and condusco:
library(bigrquery)
library(condusco)
library(whisker)

# create a simple function that will create a query
# using {{{mustache}}} placeholders for any parameters
create_results_table <- function(params){
  destination_table <- '{{{dataset_id}}}.{{{table_prefix}}}_results_{{{year_low}}}_{{{year_high}}}'
  query <- '
    SELECT *
    FROM `bigquery-public-data.samples.gsod`
    WHERE year > {{{year_low}}}
      AND year <= {{{year_high}}}
  '
  # use whisker to swap out {{{mustache}}} placeholders with parameters
  query_exec(
    whisker.render(query, params),
    project = whisker.render('{{{project}}}', params),
    destination_table = whisker.render(destination_table, params),
    use_legacy_sql = FALSE
  )
}

# create an invocation query to provide sets of parameters to create_results_table
invocation_query <- '
  SELECT
    "<YOUR PROJECT HERE>" as project,
    "<YOUR DATASET_ID HERE>" as dataset_id,
    "<YOUR TABLE PREFIX HERE>" as table_prefix,
    num as year_low,
    num+1 as year_high
  FROM `bigquery-public-data.common_us.num_999999`
  WHERE num BETWEEN 1992 AND 1995
'

# call condusco's run_pipeline_gbq to iteratively run create_results_table over invocation_query's results
run_pipeline_gbq(
  create_results_table,
  invocation_query,
  project = '<YOUR PROJECT HERE>',
  use_legacy_sql = FALSE
)

Time uploading from excel sheet to mysql

I have to insert time values from Excel into MySQL, like in the image below.
int punchin = (int) row.getCell(1).getNumericCellValue();
System.out.println("::::::::::::::::: "+punchin);
int punchout = (int) row.getCell(2).getNumericCellValue();
System.out.println("::::::::::::::::: "+punchout);
int duration = (int) row.getCell(3).getNumericCellValue();
System.out.println("::::::::::::::::: "+duration);
But I couldn't get the proper value.
Use MySQL for Excel: https://dev.mysql.com/downloads/windows/excel/
It's really easy to use. Once you have installed it, go to the Data tab in Excel and at the end will be MySQL for Excel.
Assuming you've now set up a connection to your database, all you have to do is the following:
Select all the data, excluding the column headers.
On the side panel, click the database table that you will be putting the data into.
Click Append Excel Data to Table.
Map any columns in the data to your database columns using the tool provided at this point.
Click Append and wait.
Your data should now be uploaded.
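If you read the cells programmatically instead (as in the POI snippet in the question), note that Excel stores a time of day as a fraction of a 24-hour day, so casting the numeric cell value to an int truncates it to 0. A sketch of the conversion to a MySQL-friendly HH:MM:SS string, with a hypothetical helper name:

```python
# Sketch: Excel stores a time-of-day as a fraction of a day
# (e.g. 0.375 == 09:00:00), which is what getNumericCellValue() returns.
# Keep the value as a double and convert the fraction to h/m/s.

def excel_time_to_hms(fraction: float) -> str:
    """Convert an Excel day-fraction to an HH:MM:SS string (MySQL TIME format)."""
    total_seconds = round(fraction * 24 * 60 * 60)
    hours, rem = divmod(total_seconds, 3600)
    minutes, seconds = divmod(rem, 60)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}"

print(excel_time_to_hms(0.375))  # 09:00:00
print(excel_time_to_hms(0.75))   # 18:00:00
```

In the Java/POI code this means reading the cell into a double (or using POI's date utilities) rather than an int before inserting into a MySQL TIME column.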

MySQL summary status table to store results of operations

I have a MySQL (5.6) database on my local workstation into which I routinely pull large datasets to perform analysis on. I have a separate SQL script for each dataset that imports the data and reformats it when needed (notably to convert date formats). In addition, I have other scripts that perform detailed analysis on the data.
For quality assurance, I would like to have a table named ImportLog that stores a record to capture the result of each import that is run. This table would look like the following:
ImportName DateRun RowsImported
---------- ------- ------------
ImportASR 2015-08-29 12902
ImportEAD 2015-08-30 18023
ImportHRData 2015-08-30 122376
The column definitions for ImportLog are as follows:
ImportName // the name of the script that is run
DateRun // the date that the script is run
RowsImported // the count of records imported in the run.
At the very end of each script would be the code to write one line to this table with the relevant data. For example, let's say that I ran the script named ImportASR on 8/29/2015 and it imported 12,902 records. At the end of the script, I want to append one record to ImportLog (like the first record in the table above) using something like this:
INSERT INTO ImportLog
VALUES("ImportASR", $DateRun, $RowCount);
Every time I run one of the import scripts, it would add a row to the ImportLog table with the appropriate data.
My question is: How do I populate the $DateRun variable with the current date and the $RowCount variable with the row count of the newly imported ASR dataset? Or am I trying to approach this from the wrong angle?
First thing this morning I stumbled upon the answer to my problem; it was amazingly simple, and to my surprise it didn't require any variables at all. The code to put at the end of each import script is something like:
INSERT INTO ImportLog
SELECT 'ImportASR', NOW(), COUNT(*)
FROM ASR_Full;
The ImportLog table is initially defined like so:
CREATE TABLE ImportLog (
  ImportName VARCHAR(25),
  DateRun DATETIME,
  RowCount INT
);
Hope this helps someone else!
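The same INSERT ... SELECT logging pattern can be sketched end to end in Python with the standard-library sqlite3 module (table and script names are stand-ins for the MySQL originals; MySQL's NOW() becomes datetime('now') in SQLite):

```python
# Sketch of the import-logging pattern: after an import, one
# INSERT ... SELECT records the script name, timestamp, and row count.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# The imported dataset and the log table.
cur.execute("CREATE TABLE ASR_Full (id INTEGER)")
cur.execute("""CREATE TABLE ImportLog (
    ImportName VARCHAR(25),
    DateRun DATETIME,
    RowCount INT
)""")

# Pretend the import script just loaded three rows.
cur.executemany("INSERT INTO ASR_Full VALUES (?)", [(1,), (2,), (3,)])

# The logging step: one statement, no variables needed.
cur.execute("""
    INSERT INTO ImportLog
    SELECT 'ImportASR', datetime('now'), COUNT(*) FROM ASR_Full
""")

print(cur.execute("SELECT ImportName, RowCount FROM ImportLog").fetchall())
# [('ImportASR', 3)]
```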