ABS.STAT API JSON for new period data

I am trying to automate downloading online statistics into my financial models.
I am using the Australian Bureau of Statistics database (http://stat.data.abs.gov.au/index.aspx?) to automatically pull CPI data via an API JSON link, converted into a Power Query using Excel's "Get Data From Web" functionality.
However, I can only bring in the existing data; I can't make it add a new period of data when it's published on the website. I'd be happy with either adding a new data column or replacing existing data, e.g. so that I always have the last four quarters.
I'm not an expert, and I can't find anything useful, probably because I'm not using the right terms in my Google search!
Any advice would be much appreciated.
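One approach that may help: the ABS.Stat SDMX-JSON endpoints accept startTime and endTime query parameters, so instead of hard-coding periods into the URL you can compute a rolling window and rebuild the URL on every refresh. Below is a rough Python sketch of the idea; the dataset id and dimension filter are placeholders (copy the real ones from the query builder's API link), and the same URL-building trick can be reproduced in Power Query's M.

    import datetime as dt

    import requests

    def last_n_quarters(n=4):
        """Return the last n quarter labels, oldest first, e.g. '2023-Q4'."""
        today = dt.date.today()
        year = today.year
        quarter = (today.month - 1) // 3  # 0 means Q4 of the previous year
        if quarter == 0:
            year, quarter = year - 1, 4
        labels = []
        for _ in range(n):
            labels.append(f"{year}-Q{quarter}")
            quarter -= 1
            if quarter == 0:
                year, quarter = year - 1, 4
        return labels[::-1]

    # Placeholder dataset id and dimension filter -- copy the real ones
    # from the "API query" link in the ABS.Stat query builder.
    URL = "http://stat.data.abs.gov.au/sdmx-json/data/CPI/1.50.999901.10.Q/all"

    quarters = last_n_quarters(4)
    resp = requests.get(URL, params={"startTime": quarters[0],
                                     "endTime": quarters[-1]})
    resp.raise_for_status()
    data = resp.json()  # SDMX-JSON: observations sit under data["dataSets"]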

Related

Export my analytics data and put it in a database

I am looking to export my analytics data into a SQL database. Do you know of a tool that could help me?
Also, do you know how I can see, in Google Analytics, the traffic resulting from a particular URL?
Thank you all!
You have several options:
UI export: in the top-right corner of your reports you should have an option to download data in various formats (XLS, CSV...)
API: you can use the reporting API to get it out in a programmatic/automated way
One thing you won't be able to do with the free version no matter what you try:
Reconstruct the entire analytics dataset: whether with the UI or the API, you're limited to querying a maximum of 7 dimensions at a time (e.g. ga:country, ga:deviceCategory, etc.), and certain dimensions cannot be combined (there's no official list; it's trial and error to find out), whereas there are dozens of dimensions available.
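If you go down the API route, here is a minimal Python sketch against the Reporting API v4, assuming a service account with read access to the view (the key-file name and view id are placeholders):

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    # Placeholders: point these at your own service-account key and view id.
    KEY_FILE = "service-account.json"
    VIEW_ID = "123456789"

    creds = service_account.Credentials.from_service_account_file(
        KEY_FILE, scopes=["https://www.googleapis.com/auth/analytics.readonly"])
    analytics = build("analyticsreporting", "v4", credentials=creds)

    response = analytics.reports().batchGet(body={
        "reportRequests": [{
            "viewId": VIEW_ID,
            "dateRanges": [{"startDate": "30daysAgo", "endDate": "today"}],
            "metrics": [{"expression": "ga:sessions"}],
            # up to 7 dimensions per request, as noted above
            "dimensions": [{"name": "ga:country"},
                           {"name": "ga:deviceCategory"}],
        }]
    }).execute()

    for row in response["reports"][0]["data"].get("rows", []):
        print(row["dimensions"], row["metrics"][0]["values"])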
So the question for you becomes:
How many resources do I want to invest in partially reverse-engineering Google Analytics, vs. the value it brings me, vs. what it would cost to get an alternative analytics solution?
I found a cloud-based solution which exports raw Google Analytics data to a MySQL database. Setup is simple: all you need to do is add your Google Analytics connection and the database to which the data needs to be exported.
MySQL, PostgreSQL, SQL Server and BigQuery are the supported destinations. It creates a few custom dimensions in your Google Analytics account and a tag in Google Tag Manager to send hits to Google Analytics. Data is exported from Google Analytics to the selected destination every day.
I have been using it for the last three months now. Hope this helps.
Exporting the analytics data is a thorny problem.
My understanding is that paid GA usage allows the export of all collected GA data.
But free usage does not.
For free usage, all you are going to be able to do, realistically, is create a report over your GA data (in Data Studio or Google Sheets) that contains the rows and columns you want, and then collect this information and squirt it into a SQL table. You are also liable to come up against sampling.
Re: traffic from a particular URL, the news is better: just filter on Hostname and Page.
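To sketch that "collect it and squirt it into a SQL table" step in Python (the file name and column names are placeholders for whatever your exported report actually contains; sqlite stands in for whichever database you use):

    import csv
    import sqlite3  # stand-in: swap in your MySQL/Postgres driver of choice

    conn = sqlite3.connect("analytics.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS ga_report (
        report_date TEXT, hostname TEXT, page TEXT, sessions INTEGER)""")

    # "report.csv" is a placeholder for a report exported from GA,
    # Data Studio, or Google Sheets.
    with open("report.csv", newline="") as f:
        rows = [(r["Date"], r["Hostname"], r["Page"], int(r["Sessions"]))
                for r in csv.DictReader(f)]

    conn.executemany("INSERT INTO ga_report VALUES (?, ?, ?, ?)", rows)
    conn.commit()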

Appending Vertical Data into a Horizontal Table in Access

I'm attempting to take stock data pulled from Google and create tables for each ticker to record historical market data in Access. I am able to easily import the delimited text data into Access; the problem is that I am pulling multiple tickers in one pull, and when imported, the data is vertical: each ticker symbol, with its data fields on the rows below it.
I know how to do this easily in Excel, yet I am having the worst time figuring out how to automate it in Access. The reason I am attempting to automate it is that the database will be pulling this data and updating it every 15 minutes for over 300 ticker symbols. Essentially, I need to find 'CVX' and then, in a new table, have the data below it listed out horizontally in a single row.
I have been searching online and am literally going bananas because I can't figure out how to do this (which would be simple in Excel). Does anyone have any experience manipulating data in this way, or know of any potential solutions?
After some more research I realized the data I was getting is in JSON format. After digging a little more I was able to find online converters; this one worked particularly well. After converting the file to CSV, it was easy to import the data into Access.
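Since the pull repeats every 15 minutes, a small script can do the same JSON-to-CSV flattening on a schedule instead of an online converter. A rough Python sketch (all file and field names here are invented; match them to the feed's real keys):

    import csv
    import json

    # Placeholder file and field names -- the Google feed's actual keys
    # will differ, so adjust these to match one record of your pull.
    with open("quotes.json") as f:
        quotes = json.load(f)  # assuming a list with one dict per ticker

    with open("quotes.csv", "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["ticker", "last", "high", "low", "volume"])
        writer.writeheader()
        for q in quotes:
            # one row per ticker: the vertical fields become columns
            writer.writerow({k: q.get(k, "") for k in writer.fieldnames})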

Automate excel sheet download, modify, and upload to MySQL database

I'm running an eCommerce store and I am being provided with a daily data feed of info like Item Number, Price, Description, Inventory Status, etc.
Some notes:
I know the URL of the .xls file
I need to modify the Item Number on the .xls for all products to add two letters to the beginning
Price and Inventory Status on the website database need to be updated daily for each item, matched up by Item Number
If Item Number does not exist, new item is created with all information contained in the excel sheet
This all needs to be fully automated (this is the part I need the most assistance with)
I used to have a company that took care of this for $50/month, but I now have access to the data myself. My programming experience is limited to web languages (PHP, HTML, etc.) and some basic C++. A side question would be whether or not it's worth taking this responsibility upon myself, or if I should continue working with a company that has systems in place to handle this already.
If you can get the CSV instead of the XLS, load it yourself into a new table, update what you need and then insert the rows into your production table.
If you're stuck with the XLS, find a library for PHP that will allow you to parse it and then write the records to the new table.
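If you do get the CSV, the whole daily job fits in a short script run from cron. A rough sketch in Python (the same logic ports to PHP): the URL, the "AB" prefix, and every table and column name below are placeholders, and item_number is assumed to be a unique key so ON DUPLICATE KEY UPDATE handles the update-or-insert in one statement.

    import csv
    import io

    import pymysql
    import requests

    FEED_URL = "https://example.com/daily-feed.csv"  # placeholder URL
    PREFIX = "AB"  # placeholder for the two letters to prepend

    rows = csv.DictReader(io.StringIO(requests.get(FEED_URL).text))

    conn = pymysql.connect(host="localhost", user="shop",
                           password="secret", database="store")
    with conn.cursor() as cur:
        for r in rows:  # column names are guesses at the feed layout
            cur.execute(
                """INSERT INTO products (item_number, price, inventory_status)
                   VALUES (%s, %s, %s)
                   ON DUPLICATE KEY UPDATE
                       price = VALUES(price),
                       inventory_status = VALUES(inventory_status)""",
                (PREFIX + r["ItemNumber"], r["Price"], r["InventoryStatus"]))
    conn.commit()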
As for your second question: yes, it's absolutely worthwhile to cut out the thieves who are charging you $600/year for something that should take you an hour or two to write yourself.
Good luck.
There are two suggestions here: one involves using mysqlimport, the other TOAD. If you need more explanation of how to implement this, expand your question.

Serialized data in a MySql Database to use in a Business Intelligence tool

I have a database (MySQL) and need to store some results from a web service monthly.
The data can have 10 results today but may have 200 next month.
I need to use a BI tool to create charts and what not.
Someone proposed serializing the data and saving the blobs in the database. While the solution seems to work, I have a gut feeling that when the time comes to hook it up to the BI tool, all hell will break loose.
Has anyone had this issue before?
Thanks
Edit: adding extra info.
The problem is that we haven't chosen the BI tool yet, but what it needs to do is create charts of the results. Some of the results come from Google Analytics, so we will be charting the number of visitors to a site over the last 6 months, or the number of pages viewed.
The answer is simple: do not store serialized data in a database.
Do some research, atomize your data, and create a proper data structure.
Once you've done that, you will be able to use any BI tool in the world.
That's the purpose of a database and what distinguishes a database from a flat file.
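For instance, a "long" key/value layout absorbs the 10-results-today, 200-next-month growth as extra rows rather than schema changes. A minimal sketch (table and column names are invented; sqlite stands in for MySQL):

    import sqlite3  # stand-in for MySQL; the SQL is the point here

    conn = sqlite3.connect("metrics.db")

    # One row per metric per month, instead of one serialized blob:
    # new metrics next month become new rows, not a schema change.
    conn.execute("""CREATE TABLE IF NOT EXISTS monthly_results (
        month       TEXT NOT NULL,     -- e.g. '2024-01'
        metric_name TEXT NOT NULL,     -- e.g. 'visitors', 'pageviews'
        value       REAL NOT NULL,
        PRIMARY KEY (month, metric_name))""")

    conn.executemany(
        "INSERT INTO monthly_results VALUES (?, ?, ?)",
        [("2024-01", "visitors", 1234), ("2024-01", "pageviews", 5678)])
    conn.commit()

    # Any BI tool can now chart "visitors over the last 6 months" directly:
    for row in conn.execute("""SELECT month, value FROM monthly_results
                               WHERE metric_name = 'visitors'
                               ORDER BY month"""):
        print(row)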

Notes database to MySQL (with CF?) / or how to get the NSF-datastructure

I have a commercial ColdFusion application running on a MySQL database. A possible new client has approached me; they have been working in a Lotus Notes environment (with their own database) for many years now. Of course they want to migrate their data to my application before making the move.
I'm trying to get a grip on how to get a thorough feel for the data, structure, and interdependencies in their current database application. Are there any tools to see the database structure (as in an RDBMS) of an NSF file, or is there any way to dump the structure using ColdFusion, etc.? I don't have any hands-on experience with Lotus Notes (I do, in the meantime, have a local Lotus client and their database).
I need a good starting point to be able to determine whether or not I can find a way to migrate the data.
Any ideas?
thanks
Bart
To get at the data in Notes, a good option is to use NotesSQL (IBM's ODBC driver for Notes and Domino databases).
A quick overview of the Notes data structure is this: Notes is a document-centric database, with non-relational data contained within each document. Notes Databases (NSFs) contain any number of Notes Documents, which in turn contain any number of items that hold data. Each Notes Document can have a different set of items, and thus different data in it. While that sounds like a horrible mess, usually the documents have similar data based on the form used to create the documents.
This all leads to why there is no simple way to get data out of Lotus Notes. There are a few other options, which may or may not be useful depending on how much data you have to migrate.
I personally like using XML to extract data from Lotus Notes. You can do so by creating XML views within a Notes database. IBM has a tutorial that looks helpful.
Using Java or LotusScript, you can write code to extract data from the documents to any format you wish (CSV, XML, TXT, etc.).
If it's not a lot of data, you may find getting the data into an Excel format is the simplest intermediary step. Long ago I wrote an add-in tool for exporting data from Lotus Notes to Excel, which may help you. Or you can use the "Edit > Copy Selected To Table" feature in the Lotus Notes client to copy what is visible in a Notes View to the clipboard, and then paste that into Excel. In that scenario, you'd want to edit the views so they show all the data you need.
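To give a feel for the XML route, here is a rough Python sketch that flattens such an export into rows for MySQL (every file and element name here is invented, since each NSF exposes its own items):

    import csv
    import xml.etree.ElementTree as ET

    # "export.xml" is a placeholder for an XML view exported from the
    # Notes database; the element names below are invented for the sketch.
    tree = ET.parse("export.xml")

    with open("export.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["docid", "customer", "amount"])
        for doc in tree.getroot().iter("document"):
            writer.writerow([
                doc.get("id", ""),
                doc.findtext("customer", default=""),
                doc.findtext("amount", default=""),
            ])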
I hope this helps!