Displaying a spreadsheet/grid in the browser after loading from Excel - html

I need some help with picking the right tools for a project.
I would like to import a bunch of Excel spreadsheets (preferably server side), join a number of columns, and display a single responsive, interactive grid/table/HTML5 spreadsheet in the browser for the end user. What are some of the better tools for going about this? Would Solr or Logagent fit the bill?
Is there also a mechanism (in the tool, for example) to keep what is displayed to the end user in the browser via DataTables secure, i.e. so that only the data fields the user is allowed to view are displayed?

Your question is too generic to provide a detailed answer.
Whenever I need to have Excel functionality in the browser I use SheetJS:
http://sheetjs.com/
Check out their demos:
http://sheetjs.com/demos/manifest.html
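For example, here is a minimal sketch of reading an uploaded workbook in the browser (the file-input id is a placeholder, and it assumes the SheetJS standalone build is loaded so the global XLSX is available):

```js
// Parse an uploaded .xlsx and log its first sheet as an array of row objects.
document.querySelector('#file-input').addEventListener('change', async (e) => {
  const data = new Uint8Array(await e.target.files[0].arrayBuffer());
  const workbook = XLSX.read(data, { type: 'array' });        // parse the workbook
  const firstSheet = workbook.Sheets[workbook.SheetNames[0]];
  const rows = XLSX.utils.sheet_to_json(firstSheet);          // rows as plain objects
  console.log(rows); // feed these rows into whatever grid component you choose
});
```

From there you can join columns (server side or in the browser) and hand the merged rows to the grid library of your choice.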
Hope this helps.

Related

HTML5 offline storage - photos and other content

I have a couple of questions about offline storage in HTML5. It's not an area I am that familiar with so I was hoping someone could shed some light.
I want to develop a web-based system (for mobile) that a user could potentially use offline. Obviously, the first time they use it (and any time they need to sync data thereafter), internet access would be required.
Some text data would need to be downloaded in JSON format. Basically, this will be a list of items that will appear in auto-complete forms in the app (i.e. even if the user is offline and wants to enter a type of animal, for example, they'd type in "Gir" and "Giraffe", being one of the items in that downloaded JSON list, would appear in the auto-complete box).
I would like the user to be able to take photos at certain points. This would need to be saved internally, such that when internet access is available it can be synced/uploaded to some web server.
Could someone tell me if what I am thinking of is achievable?
Thanks
Use a cache manifest to keep offline portions of your app cached. You can also store key/value data in Local Storage, including text and blobs (which you should be able to convert to photos).
This demo (and its documentation) may be a useful resource for offline photo storage.
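To make that concrete, here is a minimal sketch of the Local Storage approach (the element id and upload endpoint are hypothetical; note that Local Storage only holds strings, hence the base64 data URL):

```js
// Queue photos taken offline in localStorage, then sync when back online.
const input = document.querySelector('#photo-input'); // <input type="file" accept="image/*" capture>

input.addEventListener('change', () => {
  const reader = new FileReader();
  reader.onload = () => {
    // Persist the photo as a base64 data URL until we can sync.
    const pending = JSON.parse(localStorage.getItem('pendingPhotos') || '[]');
    pending.push({ takenAt: Date.now(), dataUrl: reader.result });
    localStorage.setItem('pendingPhotos', JSON.stringify(pending));
  };
  reader.readAsDataURL(input.files[0]);
});

// When connectivity returns, upload the queue and clear it.
window.addEventListener('online', async () => {
  const pending = JSON.parse(localStorage.getItem('pendingPhotos') || '[]');
  for (const photo of pending) {
    await fetch('/upload', { method: 'POST', body: JSON.stringify(photo) }); // hypothetical endpoint
  }
  localStorage.removeItem('pendingPhotos');
});
```

Be aware that Local Storage quotas are small (often around 5 MB), so a handful of photos can fill it.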

Get data from a website that does not show up in the HTML markup code

I am trying to retrieve data from a website for which the parameters you need to define do not show up in the URL, i.e.:
http://www.vmm.be/webrap/ibmcognos/cgi-bin/cognosisapi.dll?b_action=cognosViewer&ui.action=run&ui.object=%2fcontent%2ffolder[%40name%3d%27Water%27]%2ffolder[%40name%3d%27Afvalwater%27]%2freport[%40name%3d%27Individuele%20analyseresultaten%20per%20RWZI%27]&ui.name=Individuele%20analyseresultaten%20per%20RWZI&run.outputFormat=HTML&run.prompt=false&ui.backURL=%2fwebrap%2fibmcognos%2fcgi-bin%2fcognosisapi.dll%3fb_action%3dxts.run%26m%3dportal%2fcc.xts%26m_folder%3di5DDA04E5A00C4B6AB6DF44BB4FAD7CEC&p_RwziNr=51&run.prompt=false
How can I extract the data for different years and parameters in a programmatic way?
I am using MATLAB's urlread, but since the data I want to import does not show up in the HTML code (I have checked this with the Web Developer Toolbar in Firefox), nothing is being read in. I have no experience with websites, only MATLAB and C programming, so I have no idea how the data can be shown in the browser when it does not appear in the HTML source. Could someone point me in the right direction on how to get this done? Is it possible at all? I hope so, because I will have to repeat this for around 500 measuring stations, 10 years each, so I am not planning on copying the required data manually as I did before, when I only needed one station.
It turns out it is not possible to do what I want in MATLAB. However, I did manage to get the required data programmatically by using a combination of Selenium with C# and ChromeDriver. It's slow, but it works, and I can do other things in the meantime, so I can recommend it to anyone who is downloading data from servers in a tedious way.
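For reference, the same flow with Selenium's JavaScript bindings (selenium-webdriver) looks roughly like this; I used the C# bindings, but the idea is identical, and the shortened URL and table selector below are placeholders:

```js
// Drive a real Chrome so the page's JavaScript renders the report,
// then pull the rendered table out of the DOM.
const { Builder, By, until } = require('selenium-webdriver');

(async () => {
  const driver = await new Builder().forBrowser('chrome').build();
  try {
    await driver.get('http://www.vmm.be/webrap/...'); // full report URL as above
    // Wait for the report table to appear once the page's scripts have run.
    const table = await driver.wait(until.elementLocated(By.css('table')), 30000);
    console.log(await table.getAttribute('outerHTML')); // parse/save the rows as needed
  } finally {
    await driver.quit();
  }
})();
```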

Offline site/application to populate a form

We are building an offline version of our online store.
This is for reps to take with them on a tablet when they are out. The reason it needs to be offline is that there will be no connection in a lot of the places, and we aren't using tablets with 3G/4G connectivity. We use Windows 8.1/RT-based tablets.
Since PHP relies on a server and the tablets cannot have XAMPP or the like installed, I have rebuilt the site using HTML and CSS.
So far this works and has some flexibility. The website is turned into an application using Google's create-application button in its tools (though technically it's not really an application). It has all the product info, pictures, and videos the reps need. Another advantage is that the application and files are stored on a OneDrive cloud account shared to all the tablets. This way, I can update the app/website from my machine and have it up to date on all the reps' machines. This current setup works for now, but we are looking to add some more functionality.
What we want is a button on each product that lets a rep add an amount of that particular product to a quote form. Because each product sits on a different page, it can't be a one-page form. So as the rep presses the buttons on each product, the values get stored somewhere. Then at the end, the rep can turn all those values into a Word doc/PDF/Excel file by hitting a final submit button.
I have looked at web storage in html 5 but still not quite sure if I can get what I need using it.
Going through the explanation here:
http://diveintohtml5.info/storage.html
it looks like sites can store info, but I am not sure how to turn this into a form or document at the end. This document is what the reps will email back to head office.
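To make the idea concrete, this is roughly what I imagine with Web Storage (the element ids, storage key, and CSV export are just placeholder assumptions):

```js
// On each product page: "add to quote" pushes a line item into localStorage,
// which is shared across all pages served from the same location.
document.querySelector('#add-to-quote').addEventListener('click', () => {
  const quote = JSON.parse(localStorage.getItem('quote') || '[]');
  quote.push({
    sku: 'ABC-123', // in practice, read the product code off the page
    qty: Number(document.querySelector('#qty').value)
  });
  localStorage.setItem('quote', JSON.stringify(quote));
});

// On the final page: turn the accumulated items into a CSV file the rep
// can email to head office (Excel opens CSV files directly).
document.querySelector('#submit-quote').addEventListener('click', () => {
  const quote = JSON.parse(localStorage.getItem('quote') || '[]');
  const csv = 'sku,qty\n' + quote.map(i => `${i.sku},${i.qty}`).join('\n');
  const link = document.createElement('a');
  link.href = URL.createObjectURL(new Blob([csv], { type: 'text/csv' }));
  link.download = 'quote.csv';
  link.click();
});
```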
Has anyone got any pointers on what I could do? Since the site/app has been created in HTML already, I would like to just build on the existing framework. Are there any other pieces of software I could use? I do remember using a spreadsheet converter to turn an Excel file into a web form that exported a PDF, but the form needs to be on a single page.
All help appreciated.
Thanks

Windows tool to view website client content without a browser

Per the title, I am looking for a tool (or an existing effort by other developers) that simply grabs data off of websites so one can navigate them without viewing them in a browser. I am fully aware of how most pages work, so what I would like is to look at the data being pulled from them using Windows technology that has (hopefully) already been written. Does this make sense? Here is an example of what I would like to see in a tool:
- a Windows interface that gives me data about a webpage (menus, submenus, button names/captions, etc.)
- the ability to execute transactions on those pages by specifying what to do through the tool's interface (click a button, download an image, etc.)
Does anyone know of a tool out there to do such things?
The closest "program" that comes to mind is
WWW::Mechanize
Advertised as
Handy web browsing in a Perl object
This can in fact be used on Windows, however you
will need Perl.

How to take screenshot of rendered HTML page

Our web analytics package includes detailed information about users' activity within a page, and we show (click/scroll/interaction) visualizations in an overlay atop the web page. Currently this is an IFrame containing a live rendering of the page.
Since pages change over time, older data no longer corresponds to the current layout of the page. We would like to run a spider to occasionally take snapshots of the pages, allowing us to maintain a record of interactions with various versions of the page.
We have a working implementation of this (Linux), but the snapshot process is a hideous Python/JavaScript/HTML hack which opens a Firefox window, screenshotting and scrolling and merging and saving to a file. This requires us to install the X stack on our normally headless servers, and takes over a minute per page.
We would prefer a headless implementation with performance closer to that of the rendering time in a regular web browser, but haven't found anything.
There's some movement towards building something using Mozilla source as a starting point, but that seems like overkill to me, as well as a maintenance nightmare if we try to keep it up to date.
Suggestions?
An article on Digital Inspiration points towards CutyCapt, which is cross-platform and uses the WebKit rendering engine, as well as IECapt, which uses the present IE rendering engine and requires Windows, natch. Nothing comes to mind off the top of my head that uses Gecko, Firefox's rendering engine.
I doubt you're going to be able to get away from X, however. Since CutyCapt requires Qt, it needs either X or a Windows installation. Similarly, IECapt requires Windows (or Wine if you want to try running it under Linux, and then you're back to needing X). I doubt you'll find a rendering engine that doesn't require Qt, GTK, GDI, or Cocoa, and therefore a full install of display libraries.
Why not store the HTML that is sent out to the client? You could then redisplay it in a web browser to show what the page looked like.
Using your web analytics data about user actions, you could then default the combo boxes, fields, etc. to the values the client would have had, and even change the CSS on buttons to mark them as having been pushed.
As a benefit, you don't need the X stack, and you don't need to do any crawling or storing of images.
EDIT (Re Andrew Moore):
This is where you store the current CSS/images under a version number. Place an easily parsable version number in a comment in the HTML. If you change your CSS/images but keep the existing names, increment the version number in the HTML output sent out.
The system that stores the HTML will know that it needs to grab a new copy of the assets and store them under a new number. When redisplaying, it simply uses the version number to determine which CSS/image set to use.
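A rough sketch of that lookup on the redisplay side (the comment format and asset layout here are assumptions):

```js
// Pull the version number out of the stored HTML, then point its asset
// references at the matching archived CSS/image set.
const VERSION_RE = /<!--\s*assets-version:\s*(\d+)\s*-->/;

function rewriteAssetPaths(html) {
  const match = html.match(VERSION_RE);
  if (!match) return html; // unversioned page: leave it untouched
  return html.replace(/(href|src)="\/assets\//g, `$1="/assets/v${match[1]}/`);
}
```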
We currently have a system here that uses a very similar approach so we can track users' actions and provide better support when they call our help desk: staff can bring up the user's session and follow what they did, even somewhat live.
You can even code it to auto-censor sensitive fields when the HTML is stored.
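For example, a quick sketch of censoring before storage (the selectors are assumptions):

```js
// Blank out sensitive inputs in the captured HTML before persisting it.
function censor(html) {
  const doc = new DOMParser().parseFromString(html, 'text/html');
  for (const field of doc.querySelectorAll('input[type="password"], .sensitive')) {
    field.setAttribute('value', '***');
  }
  return doc.documentElement.outerHTML;
}
```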
Depending on the specifics of your needs, perhaps you could get away with using one of the many free webpage thumbnail services? SnapCasa, for example, lets you generate thousands per month at no charge and with no advertising (never used it; I just googled "free thumbnail service" to find it).
Just a thought.