Best way to store Excel data and access parts of it - mysql

I am looking for the best way to store a set of data on my server and then, from within an app I am building, retrieve random parts of it. The app will present the end user with study-related questions. I have 40 subjects, with 50 multiple-choice questions per subject, a few sample questions for each subject, and only 1 correct answer per question. I have been considering going down the SQL route with phpMyAdmin, but I already have all of my data neatly arranged in an Excel sheet, with columns for 'Subject', 'Sample question bank', 'Real question bank' and 'Answer bank', and the sheets containing the actual content listed under the Sample, Real and Answer bank columns.
Is manually entering and restructuring all of my data really the only/best way forward? Or is there another method, perhaps one of storing Excel files on a server and being able to call data from a given column? The way my data is arranged, I will never need a specific question; the only pair of data that must match is the proper answer to a question. All of my other calls for data within the application will be random, i.e. I will be populating the app with 20/30/40 random questions from within a particular subject.
I apologize in advance if I am violating any rules or if my etiquette is improper. Thanks very much for anyone's input or suggestions.

Import your existing spreadsheet into MySQL:
Install MySQL, if you haven't already, and set up an admin account.
From Excel, save the sheet as a CSV file.
Create a table in MySQL with the same column names.
Launch MySQL and use LOAD DATA INFILE to import the CSV:
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
Your whole spreadsheet is now in MySQL.
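If you'd rather script the import than use LOAD DATA INFILE, a rough Python sketch of the same idea is to read the exported CSV and emit INSERT statements. The table and column names here are assumptions, not your actual headers, and in a real import you'd use a database driver with parameterized queries instead of building SQL strings:

```python
import csv
import io

def csv_to_inserts(csv_text, table="questions"):
    """Turn CSV rows into SQL INSERT statements (values quoted naively)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    statements = []
    for row in reader:
        cols = ", ".join(row.keys())
        # Escape single quotes the SQL way; fine for a one-off import script.
        vals = ", ".join("'" + v.replace("'", "''") + "'" for v in row.values())
        statements.append(f"INSERT INTO {table} ({cols}) VALUES ({vals});")
    return statements

sample = "subject,question,answer\nMath,What is 2+2?,4\n"
print(csv_to_inserts(sample)[0])
```

You can pipe the resulting statements straight into the mysql command-line client.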
Then use PHP or Perl to make a web-based interface. Of course, other programming languages are just as good.

Related

Parsing JSON file with excel but only certain rows

I have a hospital pricing JSON file that management wants me to parse, but the file is over 4 million rows and, as you all know, Excel can only handle about 1 million rows. Fortunately, they only want pricing from a certain hospital group. I know how to do a basic parse of JSON files using Excel, but I don't know how to restrict the parse so it only pulls down data matching a certain criterion.
I don't see a specific question here, so I'll give a broad answer. I don't think Excel is the right tool for the job. You're better off using a scripting or programming tool to filter out the rows you need from the JSON file. You can then reuse the script you wrote when another one of these requests comes in. A simple, easy-to-use contender here is Python and its json module.
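As a sketch of that approach, assuming the file is a JSON array of objects with a hypothetical hospital_group key (the real file's field names will differ):

```python
import json

def filter_by_group(json_text, group, key="hospital_group"):
    """Keep only the records whose hospital-group field matches `group`."""
    records = json.loads(json_text)
    return [r for r in records if r.get(key) == group]

data = json.dumps([
    {"hospital_group": "Mercy", "price": 120},
    {"hospital_group": "Other", "price": 300},
])
print(filter_by_group(data, "Mercy"))
```

The filtered result can then be written back out as CSV, which will fit comfortably in Excel.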

what database design for CRFs?

I am working on a project that involves managing E-CRFs (electronic case report forms). As you may know, CRFs are documents used by researchers to collect data by asking several questions and recording the answers on paper. The goal is a web interface that relieves the researchers from paperwork.
My question concerns the design of the database. The E-CRFs are not static; if they were, I could create a simple table in the database where every field corresponds to a question in the CRF. What I want is a database that lets me create my own CRF with a variable number of fields every time, or even update an existing CRF by adding or removing a field.
How do I proceed with the database design?
Thank you.
Use a relational database only to store your metadata (e.g. who filled in the form) and store the rest in XML or JSON format, in BLOBs or flat files.
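A minimal sketch of that split, using Python's built-in sqlite3 as a stand-in for the relational side (the table, column and field names here are illustrative, not prescriptive):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE crf_responses (
        id INTEGER PRIMARY KEY,
        researcher TEXT,   -- metadata: who filled in the form
        form_name TEXT,    -- metadata: which CRF (and version) was used
        answers TEXT       -- the variable question/answer fields, as a JSON blob
    )
""")

# The JSON blob can hold a different set of fields for every form version.
answers = {"age": 42, "smoker": "no", "new_field_added_later": "yes"}
conn.execute(
    "INSERT INTO crf_responses (researcher, form_name, answers) VALUES (?, ?, ?)",
    ("Dr. Smith", "CRF-v2", json.dumps(answers)),
)

row = conn.execute(
    "SELECT answers FROM crf_responses WHERE researcher = ?", ("Dr. Smith",)
).fetchone()
print(json.loads(row[0])["smoker"])
```

The trade-off is that you can query the metadata columns with plain SQL, but filtering on the answers means decoding the blob (or using your database's JSON functions, if it has them).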

Access - Linked Excel Sheet query

I am currently working on a database which will bring a number of Excel sheets together. I have created links to the ones I need and set up relationships in Access.
I have first-year degree experience with Microsoft software packages. I am not going to move the team from Excel to Access, as other team members are more comfortable using Excel. However, running reports, creating forms and querying data can be easier in Access.
The Problem:
I am trying to query data from a linked spreadsheet, and it sometimes works and sometimes doesn't. More often than not, my queries return blank results when I know they shouldn't.
Is this something to do with the table being linked and not an access table?
Please see an example query that I have set up
Thank you in advance.
If I assume that Status On is a date field, then your criterion is treating it as text, and this expression:
Like "*/*/2013"
may return unexpected results, depending in particular on the default date format in Excel. Use the criterion:
Year([Status On])=2013
which is much more reliable: it does not depend on the formatting of the date, only on the fact that it is a recognisable date.
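To illustrate why the text pattern is fragile, here is a rough Python analogue (fnmatch stands in for Access's Like operator, and the sample date formats are made up):

```python
import fnmatch
from datetime import datetime

rows = ["03/15/2013", "2013-03-15", "15 Mar 2013"]

# A text pattern like */*/2013 only matches dates formatted with slashes
# and a trailing year, so it silently misses the other two rows.
text_matches = [d for d in rows if fnmatch.fnmatch(d, "*/*/2013")]

# Comparing the year of a *parsed* date works for any recognisable format.
def parse(d):
    for fmt in ("%m/%d/%Y", "%Y-%m-%d", "%d %b %Y"):
        try:
            return datetime.strptime(d, fmt)
        except ValueError:
            pass
    raise ValueError(d)

year_matches = [d for d in rows if parse(d).year == 2013]
print(text_matches, year_matches)
```

The same principle applies in Access: Year([Status On])=2013 operates on the date value itself, not on however the linked sheet happens to display it.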
I don't usually have issues linking to Excel files unless:
The file is open
The Excel file has links to other files or macros
It is corrupted in some way
If you are linking to an Excel file, then ideally it should be a very simple file, preferably containing nothing but a single table of data.

Automate excel sheet download, modify, and upload to MySQL database

I'm running an eCommerce store and I am being provided with a daily data feed of info like Item Number, Price, Description, Inventory Status, etc.
Some notes:
I know the URL of the .xls file
I need to modify the Item Number on the .xls for all products to add two letters to the beginning
Price and Inventory Status on the website database need to be updated daily for each item, matched up by Item Number
If the Item Number does not exist, a new item is created with all the information contained in the Excel sheet
This all needs to be fully automated (this is the part I need the most assistance with)
I used to have a company that took care of this for $50/month, but I now have access to the data myself. My programming experience is limited to web languages (PHP, HTML, etc.) and some basic C++. A side question would be whether or not it's worth taking this responsibility upon myself or if I should continue working with a company who has systems in place to handle this already.
If you can get the CSV instead of the XLS, load it yourself into a new table, update what you need and then insert the rows into your production table.
If you're stuck with the XLS, find a library for PHP that will allow you to parse it and then write the records to the new table.
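A rough sketch of that update loop, using Python with an in-memory SQLite table standing in for the site's MySQL database (the column names, feed headers and the "AB" prefix are all assumptions):

```python
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE products (item_number TEXT PRIMARY KEY, price REAL, inventory TEXT)"
)

# Pretend this is the daily feed, already converted from XLS to CSV.
feed = "item_number,price,inventory\n1001,19.99,in stock\n1002,5.50,out of stock\n"

for row in csv.DictReader(io.StringIO(feed)):
    item = "AB" + row["item_number"]  # add two letters to the beginning
    # Try to update an existing item first, matched by Item Number...
    cur = conn.execute(
        "UPDATE products SET price = ?, inventory = ? WHERE item_number = ?",
        (row["price"], row["inventory"], item),
    )
    # ...and if nothing matched, create the item instead.
    if cur.rowcount == 0:
        conn.execute(
            "INSERT INTO products (item_number, price, inventory) VALUES (?, ?, ?)",
            (item, row["price"], row["inventory"]),
        )

print(conn.execute("SELECT COUNT(*) FROM products").fetchone()[0])
```

Run the script from cron (or Task Scheduler) once a day after downloading the feed, and the whole pipeline is automated.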
As for your second question: yes, it's absolutely worthwhile to cut out the middlemen who are charging you $600/year for something that should take you an hour or two to write yourself.
Good luck.
There are two suggestions here. One involves using mysqlimport, the other TOAD. If you need more explanation of how to implement this, expand your question.

Notes database to MySQL (with CF?) / or how to get the NSF-datastructure

I have a commercial ColdFusion application running on a MySQL database. A possible new client has approached me; they have been working in a Lotus Notes environment (with their own database) for many years now. Of course they want to migrate their data to my application before making the move.
I'm trying to get a thorough feel for the data, structure and interdependencies in their current database application. Are there any tools to see the database structure of an NSF file (like in an RDBMS), or is there any way to dump the structure using ColdFusion, etc.? I don't have any hands-on experience with Lotus Notes (though I do, in the meanwhile, have a local Lotus client and their database).
I need a good starting point to determine whether I can find a way to migrate the data.
Any ideas?
Thanks,
Bart
To get at the data in Notes, a good option is to use NotesSQL.
A quick overview of the Notes data structure is this: Notes is a document-centric database, with non-relational data contained within each document. Notes Databases (NSFs) contain any number of Notes Documents, which in turn contain any number of items that hold data. Each Notes Document can have a different set of items, and thus different data in it. While that sounds like a horrible mess, usually the documents have similar data based on the form used to create the documents.
This all leads to why there is no simple way to get data out of Lotus Notes. There are a few other options, which may or may not be useful depending on how much data you have to migrate.
I personally like using XML to extract data from Lotus Notes. You can do so by creating XML views within a Notes database. IBM has a tutorial that looks helpful.
Using Java or LotusScript, you can write code to extract data from the documents in any format you wish (CSV, XML, TXT, etc.).
If it's not a lot of data, you may find getting the data into an Excel format is the simplest intermediary step. Long ago I wrote an add-in tool for exporting data from Lotus Notes to Excel, which may help you. Or you can use the "Edit > Copy Selected To Table" feature in the Lotus Notes client to copy what is visible in a Notes View to the clipboard, and then paste that into Excel. In that scenario, you'd want to edit the views so they show all the data you need.
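As a sketch of the XML route, here is how exported view data could be flattened into rows once it is out of Notes. The XML shape below is hypothetical; the actual structure depends on how you define the XML views in the database:

```python
import xml.etree.ElementTree as ET

# Hypothetical shape for an XML view exported from a Notes database.
xml_export = """
<documents>
  <document>
    <item name="Customer">Acme</item>
    <item name="Amount">100</item>
  </document>
  <document>
    <item name="Customer">Globex</item>
    <item name="Amount">250</item>
  </document>
</documents>
"""

rows = []
for doc in ET.fromstring(xml_export).iter("document"):
    # Each Notes document can carry a different set of items,
    # so build one dict per document rather than assuming fixed columns.
    rows.append({item.get("name"): item.text for item in doc.iter("item")})

print(rows[0]["Customer"], len(rows))
```

From a list of dicts like this, writing CSV for Excel or INSERT statements for MySQL is straightforward.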
I hope this helps!