I'm trying to essentially "POST" data from a form on an offline webpage to an Excel spreadsheet, a CSV, or even just a TXT file. I have seen this done with ActiveX in Internet Explorer, but the methods I saw were very specific to that user's code, so as a beginner I got a bit lost in translation. Some people also recommended an offline database driven by JS, but I'm not sure where to begin with that.
Can anyone offer some insight on this? Is it possible? What would be the best route to take?
There are many ways to accomplish this, and the best solution will be the one that suits your specific requirements. Obviously, creating a text/CSV file is easier than creating an XLS. That said, the basic pseudo code is as follows:
Collect form data
Create (in-memory or temporary) file from collected form data.
Return the file as a download to the client, save it to some location, or (best option) insert a row into a database. A minimal client-side sketch of these steps follows.
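Since the page is offline with no server behind it, here is a rough browser-only sketch of those steps. The form id "myForm" and button id "saveBtn" are made-up names; the collected fields are offered back to the user as a CSV download:

// Collect the form's fields and offer them back as a CSV download.
document.getElementById('saveBtn').addEventListener('click', function () {
  var form = document.getElementById('myForm');
  var data = new FormData(form);

  // Build a header row and one value row from the form's name/value pairs.
  var headers = [];
  var values = [];
  data.forEach(function (value, name) {
    headers.push(name);
    values.push('"' + String(value).replace(/"/g, '""') + '"'); // escape quotes
  });
  var csv = headers.join(',') + '\n' + values.join(',') + '\n';

  // Turn the CSV text into a downloadable file.
  var blob = new Blob([csv], { type: 'text/csv' });
  var link = document.createElement('a');
  link.href = URL.createObjectURL(blob);
  link.download = 'form-data.csv';
  link.click();
  URL.revokeObjectURL(link.href);
});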
I'm new to coding and it's a lot of trial and error. Right now I'm struggling with HTML tables.
For explanation: I am building an Electron desktop application for stocks. I am able to enter values through the GUI into an HTML table, and also export this as an Excel file. But every time I reload the app, all the data in the table is gone. It would be great to save this data permanently and simply add new data to the existing table after an application restart.
What's the best way to achieve this?
In my mind, the best way would be to overwrite the existing Excel file with the new work (old and new data from the table), because that would make it easy to install the tool on a new PC and simply import the Excel file to have all the data there. I don't have access to a web server, so I think a local Excel file would be better than a PHP solution.
Thank you.
<table class="table" id="tblData">
  <tr>
    <th>Teilenummer</th>
    <th>Hersteller</th>
    <th>Beschreibung</th>
  </tr>
</table>
This is the actual table markup.
Your question has two parts, it seems to me.
data representation and manipulation
data persistence
For #1, I'd suggest taking a look at Tabulator, in particular its methods of importing and exporting data. In my projects, I use the JSON format with Tabulator and save the data locally so it persists between sessions.
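As a rough sketch of that setup, assuming the Tabulator library is loaded and the page has a <div id="stock-table"></div> placeholder (the field names are just illustrative):

// Sketch: a Tabulator table fed from, and read back into, a plain JSON array.
var savedRows = []; // previously saved rows, e.g. loaded from disk on startup

var table = new Tabulator("#stock-table", {
  data: savedRows,
  columns: [
    { title: "Teilenummer", field: "partNumber" },
    { title: "Hersteller", field: "manufacturer" },
    { title: "Beschreibung", field: "description" }
  ]
});

// Later, pull the current rows back out in order to persist them.
var rows = table.getData();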
So for #2, how and where to save the data? Electron has built-in methods for getting the paths to common user directories. See app.getPath(name). Since it sounds like you have just one file to save, which does not need to be directly accessible to the user, appData is probably a good place to store it.
As for the "how" to store it – you can just write a file to that path using Node fs, though I like fs-jetpack too. Tabulator can save data as well.
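Putting app.getPath and Node fs together, a minimal sketch (the file name stock-table.json is made up; app.getPath('userData') would give you an app-specific folder instead of the shared appData directory):

// Sketch (main process): load the rows on startup and save them after changes,
// as a JSON file under the appData path mentioned above.
const { app } = require('electron');
const fs = require('fs');
const path = require('path');

const dataFile = path.join(app.getPath('appData'), 'stock-table.json');

function loadRows() {
  try {
    return JSON.parse(fs.readFileSync(dataFile, 'utf8'));
  } catch (err) {
    return []; // first run, or the file is unreadable: start with an empty table
  }
}

function saveRows(rows) {
  fs.writeFileSync(dataFile, JSON.stringify(rows, null, 2));
}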
Another way to store data is with electron-store. It works very well, though I've only used it with small amounts of data.
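With electron-store the same idea looks roughly like this (the key name "rows" is arbitrary):

// Sketch: persisting the same rows with electron-store; rows is the array you
// got from the table (e.g. via Tabulator's getData()).
const Store = require('electron-store');
const store = new Store();

store.set('rows', rows);              // save the current table data
const saved = store.get('rows', []);  // load it back, defaulting to an empty array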
So the gist is: when your app starts it loads the data, and when it quits it saves the data along with any changes that have been made, though I'd suggest saving after every change.
So, there are lots of options depending on your needs.
My coding knowledge is very basic, so please bear that in mind. Basically, there is an encoding service called vid.ly. We have hundreds of videos on there and would like to create an Excel spreadsheet with all of their information.
Vidly accepts API queries in XML and JSON. Below is a basic example of the query I want to make:
http://api.vid.ly/#GetMediaList
Is there a way that I can get Excel to send that query to the Vidly website, receive an XML/JSON response and make a table from it? I have gotten it to work with manually generated XML, but I really want Excel to pull that information automatically.
Sure, you need to write VBA code in the Excel sheet. Refer to the following URLs:
https://msdn.microsoft.com/en-us/library/dd819156%28v=office.12%29.aspx
http://www.dotnetspider.com/resources/19-STEP-BY-STEP-Consuming-Web-Services-through-VBA-Excel-or-Word-Part-I.aspx
http://www.automateexcel.com/2004/11/14/excel_vba_consume_web_services/
Imagine I've created a new JavaScript framework and want to showcase some examples that utilise it, and let other people add examples if they want. Crucially, I want this all to be on GitHub.
I imagine I would need to provide a template HTML document which includes the framework, and sorts out all the header and footer correctly. People would then add examples into the examples folder.
However, doing it this way, I would just end up with a long list of HTML files. What would I need to do if I wanted to add some sort of metadata about each example, like tags/author/date etc, which I could then provide search functionality on? If it was just me working on this, I think I would probably set up a database. But because it's a collaboration, this is a bit tricky.
Would it work if each HTML file had a corresponding entry in a JSON file listing all the examples, where I could put this metadata? Would I be able to create some basic search functionality using this? Would it be a case of: step 1, create the new example file; step 2, add a reference to the file and its metadata to the JSON file?
A good example of something similar to what I want is wbond's package manager http://wbond.net/sublime_packages/community
(There is not going to be a lot of create/update/destroy going on - mainly just reading.)
Check out this JavaScript database: http://www.taffydb.com/
There are other JavaScript databases that let you load JSON data and then do database operations. Taffy lets you search for documents.
It sounds like a good idea to me, though: making HTML files and an associated JSON document that holds metadata about them.
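As a rough sketch of how the two pieces could fit together, assuming a metadata file named examples.json (all file names and fields below are made up) and the TaffyDB script loaded on the page:

// examples.json is an array of metadata records, one per example file, e.g.
// [
//   { "file": "examples/drag-drop.html", "title": "Drag and drop",
//     "author": "alice", "date": "2013-04-01", "tags": ["dom", "events"] }
// ]
fetch('examples.json')
  .then(function (res) { return res.json(); })
  .then(function (records) {
    var db = TAFFY(records);

    // All examples by a given author, via a TaffyDB query:
    var byAlice = db({ author: 'alice' }).get();

    // All examples tagged "dom", via a plain array filter:
    var domExamples = records.filter(function (r) {
      return r.tags.indexOf('dom') !== -1;
    });
  });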
Wanted to know if the following scenario is possible -
I have some data in an Excel file. I want to make an HTML page which will have this data inside it (no other source of data). Inside the HTML page, will I be able to put text fields, buttons, etc. for a user to input data? Based on that input, I need to write queries (jQuery, I guess) to show the data that results from those queries.
Can this be done? I have not done anything so far; I just wanted to know if it is possible, and I'd appreciate someone pointing me in the right direction to start. I want to learn how to do this on my own.
Thanks in advance.
HTML is a markup language: it defines the structure of a webpage and has no mechanisms for storing or processing dynamic data.
You will have to use a client-side approach such as JavaScript + cookies, or a server-side one such as PHP + MySQL.
You want to look at using JavaScript in the page. On the server (I presume) you need to read the Excel file and generate JS objects in the page that hold the values. That is, the JS, when run, creates a collection of JS objects with the values in it. This script can be embedded in the page so that no other data access is needed.
You can then write more JS, linked to the buttons, that selects data out of these objects and displays it on the page. You probably don't want to do this from scratch -- there are good JS libraries and frameworks to leverage. Consider either GWT or YUI.
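A rough sketch of that idea follows; the element ids and field names are made up, and the products array stands in for the objects generated from the Excel file:

// The Excel rows, emitted into the page as a plain JS array when the page is built.
var products = [
  { partNumber: 'A-100', manufacturer: 'Acme', description: 'Widget' },
  { partNumber: 'B-200', manufacturer: 'Globex', description: 'Gadget' }
];

// A button handler that "queries" the array and shows the matching rows.
document.getElementById('searchBtn').onclick = function () {
  var term = document.getElementById('searchBox').value.toLowerCase();
  var matches = products.filter(function (p) {
    return p.manufacturer.toLowerCase().indexOf(term) !== -1;
  });
  document.getElementById('results').textContent = JSON.stringify(matches, null, 2);
};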
Perhaps the simplest way is to open the file in Excel and save it as text (tab-separated; comma-separated would do, too), then insert this text data into your HTML document between the tags <script type="text/plain"> and </script>. You can then write, in a rather straightforward way, JavaScript code that reads the content of this element and constructs a JavaScript array of objects (or some other suitable data structure) from it. It will then be easy to access the data in JavaScript.
This will make it possible to run queries and display data. Modifying the data would be a completely different matter.
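A minimal sketch of that parsing step, assuming the embedded block is tab-separated with a header row and is given an id of "data" (an assumed name):

// Parse the text inside <script type="text/plain" id="data"> ... </script>
// into an array of objects keyed by the header row.
var raw = document.getElementById('data').textContent.replace(/\r/g, '').trim();
var lines = raw.split('\n');
var headers = lines[0].split('\t');

var rows = lines.slice(1).map(function (line) {
  var cells = line.split('\t');
  var row = {};
  headers.forEach(function (h, i) { row[h] = cells[i]; });
  return row;
});

// rows can now be filtered, sorted, and rendered however you like.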
I'm about to start writing a program which will attempt to extract data from a Google Code site so that it may be imported in to another project management site. Specifically, I need to extract the full issue detail from the site (description, comments, and so on).
Unfortunately Google doesn't provide an API for this, nor do they have an export feature, so to me the only option looks to be extracting the data from the actual HTML (yuck). Does anyone have any suggestions on "best practice" for attempting to parse data out of HTML? I'm aware that this is less than ideal, but I don't think I have much choice. Can anyone think of a better way, or has someone else already done this?
Also, I'm aware of the CSV export feature on the issue page, however this does not give complete data about issues (but could be a useful starting point).
I just finished a program called google-code-export (hosted on GitHub). It allows you to export your Google Code project to an XML file. For example:
>main.py -p synergy-plus -s 1 -c 1
parse: http://code.google.com/p/synergy-plus/issues/detail?id=1
wrote: synergy-plus_google-code-export.xml
... will create a file named synergy-plus_google-code-export.xml.