We have used a spreadsheet for our client management so far.
Now we are growing and want several users to be able to work on the data simultaneously. We don't want to introduce a CRM, because we want to keep the data stored locally and we already have a fairly sophisticated script that takes row data from the spreadsheet via the clipboard and runs operations on it.
We have a Synology NAS that supports MariaDB.
I thought this should be a pretty simple database use case for an HTML website in a browser:
- Filtering for existing entries
- Manipulating existing entries by row
- Adding new entries

Also, it'd be great to restrict the visible entries to a certain number of rows.
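In SQL terms, what I have in mind boils down to a handful of statements like these (the table and column names are just made-up examples):

```sql
-- Filter existing entries and cap how many rows are shown:
SELECT id, name, email, status
FROM clients
WHERE status = 'active'
ORDER BY name
LIMIT 50;

-- Edit one existing entry, addressed by its primary key:
UPDATE clients
SET email = 'new.address@example.com'
WHERE id = 42;

-- Add a new entry:
INSERT INTO clients (name, email, status)
VALUES ('New Client', 'new.client@example.com', 'active');
```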
Does anyone know of existing templates which support that?
Thank you!
I'd like to push some data from MySQL into Google Sheets. Once I've edited my data in Google Sheets, I'd like to push the edited data back into MySQL. Ideally, I'd even like to schedule it to update every hour, so my data is always live and matches what's in MySQL.
I've looked into Google Apps Script, and it seems to let you type a SQL query into a cell in Google Sheets and retrieve the queried data. However, even if I find a proper way to export my data back to SQL, the main issue is that I have hundreds of tabs across multiple spreadsheets, and I'd like to find a way to avoid repeating this job manually for every tab.
Please bear in mind that this is for someone on my team who can't figure out SQL queries, has a hard time navigating MySQL, and whom I don't want to train in SQL. I would just like this person to edit Google Sheets and have those edits reflected back in MySQL, without ever having to go into my SQL database.
I think you can also use Google Apps Script to push the data back into MySQL. However, I don't know how scalable that solution would be.
Some tools exist to export data from SQL to Google Sheets, like Zapier and add-ons such as Kloud and Blockspring. The thing with Blockspring is that it's aimed at people who are familiar with SQL queries. And none of those solutions let you push the edited data back to your database (at least none that I know of... I'd be very interested to hear otherwise).
So an option would be to use Actiondesk to sync your SQL database with your Google Sheets. You can schedule the synchronisation every hour (even every ten minutes, actually), and it's easy to add new sheets/tabs anytime you need to (it's just a matter of a few clicks).
Hope this helps!
Disclaimer: I am a back-end engineer at Actiondesk and personally implemented the Google Sheets integration, so I might be somewhat biased (but at the same time, I might be the best person to answer your wildest questions in that regard, so feel free to shoot)!
It's possible to connect to MySQL with Apps Script, but you need to disable your firewall or whitelist all of Google's IP addresses (which are subject to change). As you mentioned, you'll also need to set up the script for every sheet or release the script as an add-on. You are also likely to run into difficulty writing back to the database (e.g. handling date formats).
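If you do go that route and open the database up to Apps Script, it's worth giving the script its own narrowly scoped MySQL account rather than reusing a general one. A minimal sketch, with placeholder names and a single table granted:

```sql
-- Dedicated account for the Sheets connector, limited to the one table it needs.
CREATE USER 'sheets_sync'@'%' IDENTIFIED BY 'use-a-strong-password-here';
GRANT SELECT, INSERT, UPDATE ON crm.contacts TO 'sheets_sync'@'%';
```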
SeekWell lets you automatically send data from SQL to Sheets and can also sync data from Sheets back to a database. It's built specifically to handle this use case, so it will get you up and running faster, but it's a commercial / paid product.
Disclaimer: I built this.
I have an MS Access 2016 application that a few people use in one department. I know this whole thing has web dev written all over it, but this Access database has been their process for a while and there is no time right now to switch over.
Recently, a different department asked to use this application, but with their own copy. Currently, if I need to make changes, I make them in a copy of the app; they send me their current version when I'm ready to import their data, I import it, and I send them back a new one. However, right now I copy the data table by table and paste it into the new database. This is inefficient and tedious, and with two sets of data to do this for it becomes unmanageable. There are over 20 tables, so I don't want to manually copy 40+ tables across the two apps for even the smallest change, like altering a message to the user.
I know I can copy the code over instead of importing the data, but for big changes I'll sometimes touch 15-20 VBA modules.
So, a couple questions:
1. Is there a way to generate INSERT statements for the entire database that I could run as a script, so that when I create the new copy I just run one file and it populates all the data?
2. Are there any dev tools that would help with this process? Right now I'm thinking this is just a downside of building an MS Access app, but there must be some way people have made the "new release" process easier. My current system seems flawed and I'm looking for a more stable process.
EDIT:
Currently I have all my data stored locally, in the same Access file as the front end. Since I will have two different departments using the same functionality, how do I manage the data and the front end? The two departments should each have their own Access file for entering data through the forms, so sharing one front end between them won't work.
Also, should I create two separate back ends? Currently I would have nothing to distinguish what is being inserted/changed/deleted by one department versus the other. If I were to add a field specifying who entered each record, that would require a complete overhaul of all my queries, which I don't have time for given the deadlines I need to meet.
First thing is to split the database. There is a wizard for this.
Then you can maintain the frontend without touching the real data.
Next, consider using a script to distribute revised versions of the frontend. I once wrote an article on one proven method to handle this:
Deploy and update a Microsoft Access application in a Citrix environment
I'm looking into using Google Sheets as a sort of aggregation solution for different data sources. It's reasonably easy to configure those data sources to output to a common Google Sheet, and it needs to be online for sharing anyway. This sheet would act as my raw, untreated data source, and I would then have some dashboards/sub-tables based on that data.
Now, early tests seem to show I'm going to have to be careful about efficiency, as I'm pushing against the 2 million cell maximum for spreadsheets (we're talking about 15-20k rows of data and 100 or so columns). Handling the data also seems to be pretty slow (regardless of the cell limit), at least using formulas, even when using arrays and avoiding VLOOKUPs, etc.
My plan would be to create other documents (separate documents, not just additional tabs) and refer to the source data through IMPORTRANGE with the spreadsheet key. Those would use only the subsets of the data required for each dashboard. This should let me create dashboards that run faster than if they were set up directly off my big raw data file, or at least that's my thinking.
Am I embarking on a fool's errand here? Has anyone looked into similarly large datasets in Google Sheets? Basically I'm trying to see whether what I have in mind is even practical... If you have better ideas in terms of architecture, please do share.
I ran into a similar issue once.
Using a multi layer approach like the one you suggested is indeed one method to work around this.
The spreadsheets themselves have no problem storing those two million cells; it's displaying all of the data that is problematic, so accessing it via IMPORTRANGE or scripts can be worthwhile.
Some other things I would consider:
How up to date does the data need to be? IMPORTRANGE is slow and can make the dashboards you create sluggish; maybe a scheduled import, with the aggregation happening in Google Apps Script, is a viable option.
At that point you might even want to consider using BigQuery for the data storage (and aggregation). Whether you keep pulling the data from another spreadsheet in this project or move it into a proper database, something that will not run into any issues once you exceed 2 million cells would be more future-proof (see the sketch at the end of this answer).
Alternatively, you can use Fusion Tables* for the storage, which are Drive-based, although I don't think you can run sophisticated SQL queries against them.
*: You probably need to enable them in Drive via right click > more > Connect more apps
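To give an idea of the BigQuery option: once the raw rows live in a table there, each dashboard can be fed by a small aggregate query instead of sheet formulas over the full data set. A rough sketch, with hypothetical table and column names:

```sql
-- One aggregate per dashboard, pulled into its own small sheet on a schedule.
SELECT
  source,
  DATE(created_at) AS day,
  COUNT(*)         AS events,
  SUM(amount)      AS total_amount
FROM raw_feed
GROUP BY source, day
ORDER BY day, source;
```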
I recently inherited a website that has a simple back-end area created with PHPMaker. The back-end displays various MySQL database tables.
There are two tables which hold registration information related to promotions/contests the company runs online. The client wants to begin archiving the registration data monthly, but still have the data accessible for future export or review.
So, can anyone tell me what the best approach would be to achieve this? I read about partitioning and Maatkit, but I'm not sure which - if either - would be a smart choice.
I would prefer to keep the table names the same because the table name is referenced in several instances within the PHP code running the promo/contest applications. I would also like for everything to be 'automatic' or at least executed at the click of a button; though I realize that might not be completely realistic.
I should note that I do not have the phpmaker project file and have been unable to obtain it.
Any help on this matter would be greatly appreciated.
mk-archiver is a good way to archive live MySQL database tables.
What mk-archiver does is archive rows from one table into another table and/or a file.
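Under the hood, that boils down to a copy-then-delete pattern, which you could also run yourself (for example from a monthly cron job or MySQL event) if you'd rather not add another tool. A minimal sketch, assuming a registrations table with a created_at timestamp and an archive table with the same structure:

```sql
-- Move everything older than the start of the current month into the archive table.
INSERT INTO registrations_archive
SELECT *
FROM registrations
WHERE created_at < DATE_FORMAT(NOW(), '%Y-%m-01');

DELETE FROM registrations
WHERE created_at < DATE_FORMAT(NOW(), '%Y-%m-01');
```

mk-archiver does essentially the same thing, but row by row in small batches, so the live table isn't locked for the whole operation.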
For security reasons I'm in an environment where third-party apps can't access my DB directly. For this reason I need some service/tool/script (I don't know what yet... I'm open to the best option, still reading to see what I'm going to do...)
that lets me generate, on a regular basis (daily, weekly, monthly), a CSV file with all new/modified records for a certain application.
I should be able to automate this process and also export a new file at any time.
So it should keep track, for each application, of which records it still needs.
Each application will need the data in a different format (CSV/XLS/SQL), and some fields will be needed by one application but not by another... It should be fairly flexible...
What is the best option for me? Creating some custom tables for each application and extracting the modified data based on those?
I think your best bet here, assuming you have access to the server to set this up, is to make a small command-line program that does the relatively simple task you need. Languages like Perl are good for this sort of thing, I believe.
Once you have that 'tool' made, you can schedule it through the server's OS to run at a set interval: either a scheduled task on a Windows server or a cron job on a Linux server.
You can also (if you don't want to, or can't, set up the scheduled task) enable this small command-line application to be called via CGI, which is a way of letting applications on the server be run on demand by a web user. If you do enable this, though, I suggest you add some sort of locking so that it can only be run every so often and can't be run five times at once.
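As for what that tool actually runs: if each exported table carries a last-modified timestamp and you keep a small per-application watermark table, the core of the job can be a couple of queries like the following. All names here are placeholders, and SELECT ... INTO OUTFILE needs the FILE privilege plus a path allowed by secure_file_priv:

```sql
-- Rows changed since this application's last export, written straight to CSV.
SELECT c.id, c.name, c.email, c.last_modified
INTO OUTFILE '/var/exports/billing_app_clients.csv'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
FROM clients AS c
JOIN export_watermarks AS w ON w.app_name = 'billing_app'
WHERE c.last_modified > w.last_exported_at;

-- Afterwards, advance the watermark so the next run only picks up new changes.
UPDATE export_watermarks
SET last_exported_at = NOW()
WHERE app_name = 'billing_app';
```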
EDIT
You might also want to look into database replication or adding read-only users. This saves a whole lot of arsing around. Try to find a solution that does not split or duplicate data. You can set up users so that they can only access certain parts of the database system in certain ways, such as SELECT-ing data.