How to implement data push (server push) in a worklist application - comet

I am planning to implement server push functionality in a human workflow application. When a task is assigned to a group and any group member updates it, the task status should update automatically (without a browser refresh) for the other group members who have the worklist page open.
How can I achieve this? Any ideas or suggestions will be appreciated.

If you want a realtime web server, there are a bunch of options for different technologies listed in this Realtime Web Technology Guide.
If you can provide a bit more information about your choice of server technology, I can be more specific with a recommendation/technology match.
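If it helps to make that concrete, here is a minimal sketch of the push side using Server-Sent Events with Flask (the Python framework is my assumption; the question does not say what the server is written in). Each open worklist page holds one /task-events connection and receives status changes as they are published:

    # Hypothetical sketch: SSE push of task-status changes (Flask assumed).
    import json
    import queue

    from flask import Flask, Response, request

    app = Flask(__name__)
    subscribers = []  # one queue per open worklist page


    @app.route("/task-events")
    def task_events():
        q = queue.Queue()
        subscribers.append(q)

        def stream():
            try:
                while True:
                    update = q.get()  # block until some task changes
                    yield f"data: {json.dumps(update)}\n\n"
            finally:
                subscribers.remove(q)

        return Response(stream(), mimetype="text/event-stream")


    @app.route("/tasks/<task_id>/status", methods=["POST"])
    def update_status(task_id):
        update = {"task": task_id, "status": request.json["status"]}
        # ...persist the change to the workflow store here, then fan out...
        for q in list(subscribers):
            q.put(update)
        return "", 204

On the page, new EventSource('/task-events') with an onmessage handler updates the task row in place; WebSockets or a hosted realtime service work just as well if you need two-way messaging, which is where the guide linked above helps you choose.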

Related

Synchronization across different systems

I have two systems; let's call them i and j. Each has its own database.
Each has a registration page, where a user is inserted into a user table.
What is the best way to synchronize both tables, so that if a user registers at system i they are also registered at system j?
Notes:
The systems cannot read each other's databases directly.
I can make small changes to the code if needed, as long as they do not affect system performance or normal behavior.
I can create APIs for both systems if needed.
I can add any tables or fields if needed.
I can create cron jobs, as long as they do not affect the performance of the system or server.
I'm using cPanel.
Technologies:
MySQL
PHP
REST APIs
The fact that you list cPanel as a technology suggests you're working with an inflexible budget hosting vendor, so it's unlikely they'll cooperate in setting up background tasks (cron jobs) to merge your user tables behind the scenes. (cPanel isn't a technology: it's a system administration user interface provided by hosting vendors who don't trust their customers' skills.)
So you should design and implement a REST API in the code of both your apps to perform user registration and authentication tasks. You didn't show us the details of your app, so it's hard to design it for you. Still, it seems likely you'll have to implement these operations:
PUT user
DELETE user
GET user
POST user to validate a user's password, etc. (Don't use GET to pass secret information: GET request parameters go into server logs.)
PATCH to update details of a user.
Once you get the API working, whenever you create, retrieve, update, or delete user information in one app, you'll use the API to change it in the other (see the sketch below).
Your best bet would be to create a third app just for user management and have both your existing apps use it. That way you're sure to have one coherent source of truth about users. But you can also do it with just the two apps.
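To make the sketch concrete (Python/requests here purely for brevity; the same call is equally simple with curl or a PHP HTTP client, and every endpoint, field, and token name below is a made-up example rather than part of either existing system):

    # Hypothetical sketch: after system i commits a new user locally, push the
    # same record to system j's REST API so both user tables stay in sync.
    import requests

    SYSTEM_J_API = "https://system-j.example.com/api/users"   # assumed endpoint
    API_TOKEN = "shared-secret-token"                          # assumed auth scheme


    def propagate_registration(user):
        """Create the same user on system j; returns True on success."""
        resp = requests.put(
            f"{SYSTEM_J_API}/{user['email']}",  # PUT is idempotent, so retries are safe
            json={"name": user["name"], "password_hash": user["password_hash"]},
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            timeout=5,
        )
        return resp.status_code in (200, 201)

If the call fails, write the payload to a local pending_sync table and retry later (a small cron job is enough), so a brief outage of one system does not lose registrations.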

Developing a web-based real-time dashboard

I want to consume a Kafka stream into MySQL using Python, on top of which I want to build a realtime web-based dashboard (web app) that is refreshed automatically (AJAX) on each data insert into the database.
After some searching, I found a suggestion that AJAX is not good for this purpose.
This post said WebSockets are better than AJAX in terms of performance.
Because I am not sure what the best way to achieve this is, your expert advice is needed.
Thanks.
"I want to consume kafka stream into mysql using Python; on top if which I want to build a realtime web based (web app) dashboard that will automatically be refreshed (ajax) on each data insert in the database."
... (wince!) ...
Pretty please, find someone among your peers who can save you from yourself. (And is there any possible way I can say this to you such that you can "save face"? I can't think of any.) I'm not at all trying to have public fun at your expense. Please, talk immediately to your manager; (s)he surely can help you.
I am certainly not an expert in this field; however, my company uses Elastic/Kibana to read from Kafka topics and display the data on a dashboard. This is just one of many routes you can take, but it works very well for us. You can read a little more about it here:
https://www.elastic.co/blog/just-enough-kafka-for-the-elastic-stack-part1
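If you do stay with the Kafka-to-Python route, one way to avoid AJAX polling is to have the consumer that writes to MySQL also broadcast each insert to the open dashboards over a WebSocket. This is only a rough sketch: it assumes the aiokafka, websockets, and aiomysql packages, and the topic, table, and column names are invented for illustration.

    # Hypothetical sketch: consume Kafka, write to MySQL, push to dashboards.
    import asyncio
    import json

    import aiomysql
    import websockets
    from aiokafka import AIOKafkaConsumer

    clients = set()  # currently connected dashboard pages


    async def dashboard_socket(ws, path=None):
        clients.add(ws)
        try:
            await ws.wait_closed()
        finally:
            clients.discard(ws)


    async def consume(pool):
        consumer = AIOKafkaConsumer("events", bootstrap_servers="localhost:9092")
        await consumer.start()
        try:
            async for msg in consumer:
                record = json.loads(msg.value)
                async with pool.acquire() as conn, conn.cursor() as cur:
                    await cur.execute("INSERT INTO events (payload) VALUES (%s)", (msg.value,))
                    await conn.commit()
                # push the new row instead of waiting for the page to poll
                websockets.broadcast(clients, json.dumps(record))
        finally:
            await consumer.stop()


    async def main():
        pool = await aiomysql.create_pool(host="localhost", user="app",
                                          password="secret", db="dashboard")
        async with websockets.serve(dashboard_socket, "0.0.0.0", 8765):
            await consume(pool)


    asyncio.run(main())

The dashboard page then opens a WebSocket to port 8765 and redraws when a message arrives, which scales better than polling MySQL on a timer.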

Recommendation for BigQuery Reporting/BI Tool

I work for a web hosting company looking to integrate different data sources with BigQuery. The question now is what would be an ideal reporting/BI tool to get the data out of BigQuery so that proper/fast/easy retrieval, analysis, and reporting can be done with it.
I'm looking into the options suggested by Google here: https://cloud.google.com/bigquery/partners/ but I was wondering if someone out there has more hands-on experience and could make a recommendation.
The company works with a MySQL-based billing system (with client, support, and service data), which is the main source of information, along with chat, CMS, and in-house-developed systems that provide other sources of information used to maintain the web infrastructure the business depends on.
Thank you.
It's really hard to answer this; it depends on the personnel you have at hand.
For idea validation we mostly use Data Studio.
Some personnel know Tableau, but once you are outside GCP everything becomes a slow process, with queries and interface updates taking 30-60 seconds, as these tools relay and store the data on their own.
We have wired some data to Elasticsearch as well, and we use Kibana.
But once it's all validated, we consolidate the reports into our own dashboards, mainly because we are mostly developers and can do the programming. If you have a data analyst or data scientist with their own tools, let them use what they are comfortable with.
Always iterate and version; as a developer you should be driven by a good product manager who tells you exactly which charts to build.
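For the "consolidate into our own dashboards" route, the plumbing is just the official BigQuery Python client plus whatever front end you already have. A minimal sketch (the project, dataset, table, and column names are placeholders):

    # Minimal sketch: feed a hand-built dashboard from BigQuery.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application default credentials

    QUERY = """
        SELECT DATE(created_at) AS day, COUNT(*) AS signups
        FROM `my-project.billing.clients`
        WHERE created_at >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 30 DAY)
        GROUP BY day
        ORDER BY day
    """

    rows = client.query(QUERY).result()  # runs the query and waits for it
    chart_data = [{"day": row.day.isoformat(), "signups": row.signups} for row in rows]
    # hand chart_data to whatever renders the dashboard (a JSON endpoint, a template, ...)

That keeps the heavy lifting in BigQuery and leaves the tool choice (Data Studio, Tableau, Kibana, or your own charts) as a thin layer on top.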

Couchbase share views among developers (import/export functionality)

At first, this question appeared too trivial to me to actually require a Stack Overflow post. However, after many Google searches for the information, I am at a loss when trying to figure this out about Couchbase.
In Couchbase (I am using the 2.2 Community version), how do I share views among developers? Is there some sort of import/export functionality available? If not, then how does Couchbase intend for developers to share the views that they are using without manual copying/pasting? It is obvious that the code a development team writes for querying Couchbase will require accurate view names. Without a way to send a developer a view file to accurately set up a Couchbase DB, how can it even be possible to develop with Couchbase locally as a team?
I'm sorry if I sound a little desperate or harsh here, but if it isn't possible to share views among multiple developers, then I don't see how Couchbase can be a viable DB solution for a team of developers trying to share database configuration, similar to how a team using an SQL DB would share schema files to set up the DB.
There are several ways you can approach this:
1) Create views programmatically, as demonstrated here in Java:
http://tugdualgrall.blogspot.com.es/2012/12/couchbase-101-create-views-mapreduce.html
or here in Node.js:
http://www.tuicool.com/articles/RvYbQn
2) Store all your views in your version control system (this is the option I use). If you are developing locally, you only need your personal view code; once the views are working and your tests are all passing, you can check them in.
I assume you'd then deploy to a testing environment, so yes, sadly you'd have to update the views there either by hand or by using option 1.
You could also take a look at using this tool, but only for views: http://www.couchbase.com/communities/q-and-a/how-bulk-import-design-docs-and-views-couchbase-server
This functionality is not currently available in the admin UI.
There is an open defect/enhancement, "Ability to import/export views" (MB-8436). You can leave your feedback there and vote for it so it gets included in a future release.
In the meantime you can use the Design Document REST API.
There is also a workaround blog post.
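As a rough illustration of the REST route (Python only for brevity; the bucket name, credentials, and the default 2.x view port 8092 are assumptions about a local install), you can keep the design document JSON in version control and push it to each developer's node:

    # Hypothetical sketch: publish a design document kept in version control
    # (design_docs/users.json) to a local Couchbase 2.x node via the
    # Design Document REST API.
    import json

    import requests

    BUCKET = "default"
    DDOC = "dev_users"  # the "dev_" prefix keeps it a development view
    URL = f"http://localhost:8092/{BUCKET}/_design/{DDOC}"

    with open("design_docs/users.json") as f:
        # e.g. {"views": {"by_email": {"map": "function (doc, meta) { ... }"}}}
        design_doc = json.load(f)

    resp = requests.put(
        URL,
        data=json.dumps(design_doc),
        headers={"Content-Type": "application/json"},
        auth=("Administrator", "password"),
    )
    resp.raise_for_status()
    print("published", DDOC, resp.status_code)

Run the same script against each environment (local, test) and the views stay identical everywhere, which covers the "share views as files" workflow from option 2.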

Can Tableau be used in customer-facing and SaaS web applications?

I was hoping someone could help me answer a couple of questions regarding Tableau. I am not very familiar with the platform, but I have a client who is looking for a reporting/analytics/data visualization platform that they could use for many internal apps (for their employees) and external (customer-facing, login-protected) applications.
The driver is that each of their internal teams has used disparate technologies such as SSRS, Crystal, and custom ASP.NET controls (Kendo/Telerik, etc.), but now they have the opportunity to choose a common platform that could serve most or all of their future reporting and data visualization needs for both enterprise and customer-facing solutions.
They are looking for a platform that provides everything from simple grids with basic filter/sort/group, all the way to rich charting and ad-hoc reporting with slicing and dicing of data.
They will not always be creating dashboards in these apps since they are customer-facing, but they may want to have dashboards for internal (intranet) apps. They will definitely want the ability to build true internal BI dashboards to report on data from all these online apps across all customers, to whom they provide their SaaS/customer-facing web apps.
One of our main concerns revolves around security of data, as some of these customer-facing web apps are multi-tenant, so we'd need to ensure that data is always filtered by the client tenant id. We also have a very customized security model, with data-driven roles and permissions that may prevent showing certain types of data (e.g. SSN, salary, etc.).
Does Tableau fit this model? Can it meet most or all of these requirements, or is it meant more for internal data?
It should be quite possible by setting up a reverse proxy to front your multi-tenant web application. There is a document on how to set up Apache as a reverse proxy with Tableau, with or without SSL.
I am familiar with how to configure Apache as a reverse proxy, so here are the details of setting up reverse proxy rules with the Apache web server.
There may be some documentation for fronting with IIS/Nginx, so you should do some googling yourself.
You need to harden your web server configuration by limiting access from outside the external firewall to read-only pages, while internal users can access all pages. Since external users are allowed access only to read-only pages, I presume requests from external users will be only GET requests, plus a few PUT/POST requests when users choose to use filters. So you can block external users from any request except GET, with exceptions for the pages that allow applying filters and grouping.
In your multi-tenant application, make sure you refer to the Tableau URLs by the Apache server URL that is exposed to the outside world. If any URL not configured in Apache is used, users will receive an access denied error. You need to create a role that has read-only access to Tableau pages for external users. To address multi-tenancy you need to set a cookie or similar to identify the tenant, and something similar to identify the user. To filter SSNs and other sensitive information you can use mod_proxy_html, which filters content. You can also use Apache's mod_security module to block SSNs and credit card numbers.
References:
Configuring Apache Server as Proxy with Tableau
Apache mod_proxy documentation
Blocking POST requests
mod_security FAQs
Yes to most of your questions -- with just a little fine print.
First, remember Tableau is primarily about visualizing data, so it is great for publishing read-only interactive views of data. If you want to allow end users to edit data, you'll have to do that by another means. Fortunately, the Tableau JavaScript API lets you interact closely with Tableau from your custom JavaScript code. So if your needs are mostly about visualization, but you want to be able to trigger some custom code to modify data in some of your apps, you should be fine. But Tableau is not designed for creating custom CRUD apps as a rule.
The great thing about Tableau Server is that many people can learn to use it and publish their own visualizations -- even if they don't know how to program. That doesn't mean they will win visualization design awards the first time, or that they shouldn't learn something about how databases work if they want good performance. But it does mean the people who know their data best can learn to design and publish their own visualizations without having to wait three months on a backlog queue so the one IT guy can change the color of a button or add a field. It would still be good to have experienced system, database, and visualization folks help train, organize data, set governance and security rules, optimize, etc., but business users can learn to be the ones with hands-on control over how their information is presented. That's a good thing.
The security question has several moving parts, and there are usually good answers from Tableau depending on what you're trying to accomplish. Tableau Server supports multi-tenancy using sites. There is a fairly flexible permissions and group policy system. It can use SAML for authentication, and it has several features for providing access specific to the user/tenant. It works with almost every database, and you can in some cases push your security enforcement to the database server -- SQL Server, for instance. There is a trusted ticket feature where you can defer some authorization decisions to another server, say a web portal server, which is useful when Tableau visualizations are embedded in some other web page.
Most security use cases can be supported out of the box, but there are some complex custom access control situations that are currently tricky to implement in Tableau Server. Nothing you've listed sounds outside the normal swim lane, but the only way to know whether your security model is too complex is to dive into the details. Hopefully they will release a custom access control API for users who want to extend it.
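To make the trusted ticket flow concrete, the portal side is a single POST to Tableau Server's /trusted endpoint. This is a rough sketch: the host, workbook, and view names are placeholders, and it only works after the portal's IP has been registered as a trusted host on Tableau Server.

    # Rough sketch of Tableau trusted authentication from a web portal.
    import requests

    TABLEAU = "https://tableau.example.com"  # placeholder server URL


    def embed_url_for(username, view_path="WorkbookName/ViewName"):
        # Ask Tableau Server for a one-time ticket for the logged-in user.
        resp = requests.post(f"{TABLEAU}/trusted", data={"username": username})
        ticket = resp.text.strip()
        if ticket == "-1":
            raise RuntimeError("Tableau Server refused to issue a trusted ticket")
        # The ticket is single-use and short-lived; redeem it right away in the embed URL.
        return f"{TABLEAU}/trusted/{ticket}/views/{view_path}"

Per-tenant row filtering then happens inside Tableau (user filters or a data source keyed to the signed-in username), not in the proxy.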
At a high level, you certainly can use Tableau to build customer-facing dashboards. You can build and deploy them quickly and, as others mentioned, you can embed them in an iframe and customize most of it with the JavaScript API. But it doesn't provide the complete flexibility for user interaction that you get with other technologies. Other options include hand-coding a framework and then using charting libraries.
For simple dashboards, Tableau would be the obvious choice if you have already bought core licenses. But looking at what's going on in the industry, Tableau will not be able to fulfill all needs.
If using Tableau:
1. Building charts/tables/visualizations is super simple and efficient.
2. You can expose detail-level (fine-grained) data to customers; because of Tableau's proprietary columnar database engine, you can potentially expose millions of records via a dashboard.
3. You can use Tableau's security and access control mechanisms.
4. As another user mentioned, you can use the trusted ticketing mechanism to integrate easily with other applications (portals, etc.).
Challenges with the Tableau approach:
1. If you have late-arriving transactions (in the Internet world it's common to mark a click as fraudulent after a few days), you have to do a full refresh of the extracts, which means that if you are showing, say, 13 months' worth of data, you have to refresh it all, all the time. With big data, the business wants all data all the time, which means you would end up extracting millions of records throughout the day.
2. Very little flexibility in user interactions, like menus, drop-downs, etc.; you have to work with what Tableau provides.
3. If you have multiple charts on the same dashboard page, there is no user-friendly way to download the underlying data.
4. Many other challenges in laying out visualizations on the dashboard page, as there is no easy way to control the canvas with pixel-level precision, white space, etc.
After analyzing your use case, you should consider very carefully whether Tableau would be the right product before you invest in it.
Tableau's primary power comes from its desktop tool for data visualization/exploration, not from pre-built dashboards.
Best of luck.
Since Tableau Public is also based on Tableau, I assume you can make your dashboards public using your own Tableau infrastructure.