Appending Vertical Data into a Horizontal Table in Access

I'm attempting to take stock data pulled from Google and create tables for each ticker to record historical market data in Access. I can easily import the delimited text data into Access; the problem is that I am pulling multiple tickers in one pull, so the imported data comes in vertically, with each ticker symbol followed by its data points on the rows beneath it.
I know how to do this easily in Excel, yet I am having the worst time figuring out how to automate it in Access. The reason I am attempting to automate it is that the database will be pulling this data and updating it every 15 minutes for over 300 ticker symbols. Essentially, I need to find 'CVX' and then, in a new table, have the data listed below it laid out horizontally.
I have been searching online and am literally going bananas because I can't figure out how to do this (which would be simple in Excel). Does anyone have any experience manipulating data in this way or know of any potential solutions?

After some more research I realized the data I was getting is in JSON format. After digging a little more I was able to find online converters, one of which worked particularly well. After converting the file to CSV, it was easy to import the data into Access.
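For the pivot step itself, once the rows are in a table, an Access crosstab query can turn the vertical data horizontal. A minimal sketch, assuming the imported rows land in a table called RawQuotes with columns Ticker, FieldName, and FieldValue (all hypothetical names for this example):

    TRANSFORM First(FieldValue)
    SELECT Ticker
    FROM RawQuotes
    GROUP BY Ticker
    PIVOT FieldName;

Each distinct FieldName value becomes its own column, giving one horizontal row per ticker; saved as a query, the crosstab can feed an append or make-table query on every 15-minute refresh.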

Related

Automate Data Feeds challenges

I get monthly data feeds from a source that I need to import into a database, but the problem is that the feed changes every month; sometimes there are more columns and sometimes there are fewer. There is no consistency whatsoever.
How do I manage and automate these data feeds?
Rather than expertise and out-of-the-box thinking, I think this needs a forthright response: automation is impossible unless the data provider commits to a particular column structure.

What is the best way to routinely import a CSV or XML file into an MS Access database?

I have an Access database that keeps track of many different aspects of my company's performance, and I would like to add functionality to track the hours the employees are working.
The hours are all tracked on a website called timetracker. It has a few reporting options, including XML and CSV files, and a favorite-report feature that pulls the same data in the format I want every week.
What I would like to do is find the best process for getting the data from this website, into a table in my database that I can reference.
I will not be the one executing whatever process I come up with, so I would really like it to be as easy as possible for whoever does have to run it.
Right now I have a linked table pointing at an XML file in our SharePoint folder. I was thinking that maybe we could just run the report and download the file every week, save it over the old file (keeping the sheet names the same), and the linked table should update.
What I am wondering is whether anyone can come up with an easier process: one that takes the least time and is easiest to write down instructions for that anyone could execute.
(Would it maybe be possible to create some sort of macro to actually download the report automatically?)
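For the CSV version of the report, one low-effort option is a saved Access append query that reads the file in place through the Jet/ACE text driver, so the written instructions reduce to "save the downloaded report into the agreed folder, open the database, run this query". A sketch, where EmployeeHours, C:\Reports, and timetracker.csv are placeholder names:

    INSERT INTO EmployeeHours
    SELECT *
    FROM [Text;FMT=Delimited;HDR=Yes;DATABASE=C:\Reports].[timetracker.csv];

This assumes the CSV has a header row and the same columns each week; clearing last week's rows first (a one-line DELETE query) keeps the table from accumulating duplicates.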

Access throwing unparseable records error from Cognos Report

Before I begin, yes I know everything I am about to explain is back-asswards but at the moment it is what I have to work with.
My organization is in the middle of a Cognos 10 BI implementation, and at the moment we are having HUGE performance issues with the data cubes, severely hampering our end users' ability to slice data in an ad-hoc fashion. Historically we used large data extracts from SAP, manipulated in MS Access, to provide a daily-updated data source that end users could pivot around in Excel.
As this is NOT transactional data, that worked well; we never had more than half a million records, so performance was never an issue.
As our implementation team has been unable to provide management with functioning data cubes we can use for static views and reports, I have been tasked with using Cognos data extracts to re-create the old system temporarily.
The issue I am running into is that at random (three times one week, once the next) the files will contain unparsable records. I doubt it is a special-character issue, as I can re-download the file and it imports fine the second or third time.
Does anyone have any experience with something similar? I realize the data sets provided by Cognos were not designed for this purpose, but it seems strange that 20 percent of the files come through corrupted. Also strange: when I select an .xls spreadsheet as the download format, what I get seems to be a Unicode text file with the extension changed to .xls.
Any insight would be appreciated.
EDIT: Diffing the whole files will be my next experiment. I HAVE, however, compared byte for byte the specific records that are unparsable in one file yet parsable in the next, and have not found any difference.
As for the import, I manually convert the file to Unicode text and import it from that format.
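One way to see which rows are breaking, rather than letting the import silently reject them: pull each file into a staging table as a single wide text column and count the delimiters per line. A sketch in Access SQL, where StagingLines, its Long Text column RawLine, tab delimiters, and the expected 12 tabs (13 fields) per record are all assumptions to adapt:

    SELECT RawLine
    FROM StagingLines
    WHERE Len(RawLine) - Len(Replace(RawLine, Chr(9), '')) <> 12;

Any rows this returns are the unparsable ones, which makes it easier to spot whether the corruption is a stray delimiter, an embedded line break, or something else.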

Best way to store Excel data and access parts of it

I am looking for the best way to store a set of data on my server and then, from within an app I am building, retrieve random parts of that data. The app will present the end user with study-related questions: I have 40 subjects, with 50 multiple-choice questions per subject, a few sample questions for each subject, and only 1 correct answer per question. I have been considering using phpMyAdmin and going down the SQL route, but I already have all of my data neatly arranged in an Excel sheet, with columns for 'Subject', 'Sample question bank', 'Real question bank', and 'Answer bank', and the sheets containing the actual content listed under the Sample, Real, and Answer columns.
Is manually re-entering and restructuring all of my data really the only/best way for me to move forward? Or is there another method, perhaps one of storing the Excel files on a server and being able to call data from a given column? The way my data is arranged, I will never need one specific question; the only pair of data that must match is the proper answer to its question. All of my other calls for data within the application will be random, i.e. I will be populating the app with 20/30/40 random questions from within a particular subject.
I apologize in advance if I am violating any rules or if my etiquette is improper. Thanks very much for anyone's input or suggestions.
Import your existing spreadsheet into MySQL:
1. Install MySQL, if you haven't already, and set up an admin account.
2. From Excel, save the sheet as a CSV file.
3. Create a table with the appropriate column names.
4. Launch mysql and use LOAD DATA INFILE to import the CSV: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Your whole spreadsheet is now in MySQL.
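A sketch of the table and import steps, with placeholder names (questions, subject, bank, and the file path are guesses based on the columns described above):

    CREATE TABLE questions (
        subject  VARCHAR(100),
        bank     VARCHAR(10),   -- 'sample' or 'real'
        question TEXT,
        answer   TEXT
    );

    LOAD DATA LOCAL INFILE '/path/to/questions.csv'
    INTO TABLE questions
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\r\n'
    IGNORE 1 LINES;

Pulling a random batch for one subject is then a single query ('Anatomy' and the limit of 20 are made-up values):

    SELECT question, answer
    FROM questions
    WHERE subject = 'Anatomy' AND bank = 'real'
    ORDER BY RAND()
    LIMIT 20;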
Then use PHP or Perl to make a web-based interface. Of course, other programming languages are just as good.

Automate excel sheet download, modify, and upload to MySQL database

I'm running an eCommerce store and I am being provided with a daily data feed of info like Item Number, Price, Description, Inventory Status, etc.
Some notes:
I know the URL of the .xls file
I need to modify the Item Number on the .xls for all products to add two letters to the beginning
Price and Inventory Status on the website database need to be updated daily for each item, matched up by Item Number
If an Item Number does not exist, a new item is created with all information contained in the Excel sheet
This all needs to be fully automated (this is the part I need the most assistance with)
I used to have a company that took care of this for $50/month, but I now have access to the data myself. My programming experience is limited to web languages (PHP, HTML, etc.) and some basic C++. A side question would be whether it's worth taking this responsibility on myself or whether I should continue working with a company that already has systems in place to handle this.
If you can get the CSV instead of the XLS, load it yourself into a new table, update what you need, and then insert the rows into your production table.
If you're stuck with the XLS, find a PHP library that will let you parse it, and then write the records to the new table.
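A sketch of that staging approach in MySQL, assuming the feed columns listed in the notes above and placeholder names throughout (staging_feed, products, 'XX' for the two added letters); it also assumes item_number is a unique key on products:

    LOAD DATA LOCAL INFILE '/path/to/feed.csv'
    INTO TABLE staging_feed
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    IGNORE 1 LINES
    (item_number, price, description, inventory_status);

    -- add the two letters to the beginning of every Item Number
    UPDATE staging_feed
    SET item_number = CONCAT('XX', item_number);

    -- update price and inventory for existing items, create any that are missing
    INSERT INTO products (item_number, price, description, inventory_status)
    SELECT item_number, price, description, inventory_status
    FROM staging_feed
    ON DUPLICATE KEY UPDATE
        price            = VALUES(price),
        inventory_status = VALUES(inventory_status);

Run from a daily cron job (after emptying staging_feed), this covers the daily update, the create-if-missing case, and the automation in one pass; the XLS-to-CSV conversion is the only piece left outside the database.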
As for your second question: yes, it's absolutely worthwhile to cut out the thieves who are charging you $600/year for something that should take you an hour or two to write yourself.
Good luck.
There are two suggestions here: one involves using mysqlimport, the other TOAD. If you need more explanation of how to implement either, expand your question.