Tableau custom web connector - CSV

Goal
At the moment I use Tableau to generate reports based on data I store in KnackHQ (an online database service with a REST API). I currently have to manually export the data from KnackHQ as Excel (or .csv) and upload it to Tableau, and I would like to automate this process.
Investigation
It seems there are several options to try:
Web connector via ODBC (it seems I would need to write my own ODBC driver in my case, which can be tricky)
Export CSV (http://databoss.starschema.net/tableau-live-data-connection-to-csv-over-http-rest/)
OData (http://www.odata.org/)
Data Engine API (http://www.tableau.com/new-features/data-engine-api-0)
Any others?
Any ideas about which method would be simplest for this kind of integration?

Tableau 9.1 (due for release in 2015 Q3, with a beta already available) has a new feature: the Tableau Web Data Connector. This is exactly what you need, since:
You can build custom connectors using the Tableau WDC JavaScript API (see the sketch after this list)
You can create and refresh data sources in Tableau Desktop using these connectors
Your administrator can import the connector; from that point, Tableau Server can schedule and refresh the extracts based on your web-service data
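To give a feel for the API, here is a minimal sketch of a connector written against the 9.1-era WDC 1.x JavaScript API. The Knack-style endpoint URL, object key and field names are placeholders, jQuery is assumed for the HTTP call, and authentication headers are omitted, so treat this as a starting point rather than a working connector and verify the callback names against the official documentation:

    (function () {
        var myConnector = tableau.makeConnector();

        // Declare the columns the connector will return.
        myConnector.getColumnHeaders = function () {
            var fieldNames = ["id", "status", "created_at"];   // placeholder field names
            var fieldTypes = ["string", "string", "datetime"];
            tableau.headersCallback(fieldNames, fieldTypes);
        };

        // Fetch rows from the REST API and hand them to Tableau.
        myConnector.getTableData = function (lastRecordToken) {
            // Placeholder endpoint; real Knack calls also need application-id / API-key headers.
            $.getJSON("https://api.knackhq.com/v1/objects/object_1/records", function (resp) {
                var rows = resp.records.map(function (r) {
                    return { id: r.id, status: r.field_1, created_at: r.field_2 };
                });
                // false = no more pages; pass a token and true to page through larger result sets.
                tableau.dataCallback(rows, lastRecordToken, false);
            });
        };

        tableau.registerConnector(myConnector);
    })();

The hosting HTML page includes the tableauwdc JavaScript library and a button whose click handler calls tableau.submit() to start the extract.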
I would suggest checking out these posts on how to write a new connector:
5 Things to Know about the Tableau 9.1 Web Data Connector
Google Spreadsheet with Tableau Web Data Connector
MongoDB Tableau Web Data Connector
Github Tableau Web Data Connector
You can also check the official Tableau documentation, for example on how to manage connectors: Web Data Connectors in Tableau Server.
If you want to try this out before it is released, please follow the instructions on the Tableau Community site: http://community.tableau.com/message/327560.

Related

How do I develop a connection from the Power BI service to a large MySQL database on AWS?

I recently discovered Power BI as part of our Office 365 subscription, so I am very new to it.
We have a MySQL database with about 5 million rows in AWS. I want to add this as a data source to our Office 365/Power BI service.
How to do this?
I see there is no content pack service that allows me to do this.
According to this SO question and answer, there is no direct way to do this: How to connect POWER BI web with AW mysql database?.
I also looked at using a Power BI Gateway to achieve this. There are two types: Personal and On-Premises. We don't have any Windows Servers, so this leaves the Personal option: https://powerbi.microsoft.com/en-us/documentation/powerbi-personal-gateway/
For Personal, the documentation at that link says "A personal gateway is not required in order to refresh datasets that get data only from an online data source" which is a little confusing given that this seems to be the only option for connecting to my online data source (maybe this document meant to say "from a supported online data source"?). It seems that I install this on a local machine in our office, connect to my AWS MySQL database, query/model on my desktop, then upload my results to our Power BI Service for the rest of our company to access. I schedule refreshes using the Personal Gateway. Is this correct? I hope this does not involve the transfer of millions of rows to/from desktop and/or Power BI Service?
P.S. I also considered developing something similar to the content packs that are provided for GitHub, Google Analytics, MailChimp, etc., but there doesn't seem to be a "private" way to do this. Going that route seems to involve becoming a Certified Azure Developer (even though there is no Azure in this problem) and then making the solution public (which I obviously don't want to do): https://azure.microsoft.com/marketplace/programs/certified/apply/. If there is a way to develop my own "private" solution without the certification and publication process, I would consider that.
I would tackle this through Power BI Desktop. You will need a Windows machine to install it on, and that machine will need the MySQL connectors installed; see:
https://stackoverflow.com/a/32746679/1787137
Then I would develop and publish your queries, datasets and reports using PBI Desktop.
Finally, I would configure the Power BI Personal Gateway to schedule refreshes of the published report datasets.
Five million rows is not trivial, but it is quite feasible in this scenario. You will likely only need the tables and columns that have analytical value.

How do I do real-time conversion of SQL Server data to a JSON format using ASP.NET and Visual Studio?

My overall goal is to take data from MS SQL Server and export it, in real time, to a JSON format so that it can be fed into the amCharts Dataloader and rendered in the front end as a line graph that keeps trending as new information comes into SQL Server.
I plan to have real-time information forwarded from MS SQL Server to the front-end web page. So basically: how would I convert the data to JSON, and how would I keep the information updated in real time as the server receives new data?
I'm using Visual Studio to load the SQL Server data, and from there I want to integrate the JSON data into the website. So far I have nothing beyond SQL Server connected to Visual Studio 2013.
This is a broad question with many possible implementations; one approach is below.
Convert data to JSON? You can use JSON.NET.
There are lots of examples in its documentation. You can construct your object with the library and perform the serialization you need.
"How would I make it so that the information is updated in real time
as the server receives new information?"
You can construct a SqlCacheDependency, which invalidates cached data when the underlying SQL Server table changes.
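On the browser side, a rough sketch of how the page could consume that JSON and keep an amCharts line chart current is shown below. The /api/readings endpoint and the field names are assumptions for illustration (the server side would serialize the SQL Server rows with JSON.NET), and it simply polls on a timer; something like SignalR or WebSockets would let the server push updates instead:

    // Build an amCharts (v3) serial chart and refresh it by polling a JSON endpoint.
    // jQuery is assumed for the HTTP call; URL and field names are placeholders.
    var chart = AmCharts.makeChart("chartdiv", {
        "type": "serial",
        "categoryField": "timestamp",
        "graphs": [{ "valueField": "value", "type": "line" }],
        "dataProvider": []
    });

    function refreshChart() {
        $.getJSON("/api/readings", function (rows) {   // hypothetical ASP.NET endpoint returning JSON
            chart.dataProvider = rows;                 // e.g. [{ "timestamp": "...", "value": 42 }, ...]
            chart.validateData();                      // redraw the line with the new data
        });
    }

    setInterval(refreshChart, 5000);                   // poll every 5 seconds
    refreshChart();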

Pushing MySQL inserts to SharePoint 2013 lists

Long version, what I've read and tried so far:
I'm trying to optimize a weekly task for some of our users. We have a CMS built with Angular that uses MySQL to store entries; these entries are reviewed within the CMS, after which users write an evaluation in SharePoint 2013 (on-premises SQL Server 2012 on one machine, SharePoint 2013 on another server, all in the same domain).
The goal is to have the MySQL entries show up in a SharePoint list as soon as they are created. It would be highly preferable if the arrival of a new MySQL entry could trigger a SharePoint 2013 workflow (send an email, update web parts, log changes to task lists, etc.).
The CMS currently has NuSOAP installed to feed a mobile application, and I read that web services are one of the preferred ways to push data into workflows, but SharePoint Designer has issues reading all the fields from the WSDL and I could not get SharePoint's "Call HTTP Web Service" action to work. Adding the WSDL as a data source only included the index and did not display anything in a web part.
An alternative option was using ODBC on the SQL Server to grab the data in a SQL view from the linked server and copy new entries over when they arrive. I have managed to make a view that can be used as an External Content Type in SharePoint Designer 2013 and that can successfully be displayed in SharePoint as an external list.
Given the nature of the submissions, the content cannot be altered in the MySQL database. But I do want users to be able to add notes and other fields in SharePoint.
I found in two (older) sources (one of them on Stack Overflow) that it is not possible to fire a workflow when using External Content Types, since "SharePoint does not own the data" (SQL Server does), so you cannot attach a workflow or have a workflow notified when an entry is added.
==
TL;DR version
How do you make a "copy" of MySQL data from a read-only source into a SharePoint 2013 list?
Neither the business data connection tools (web services as a data source) nor the External Content Type approach seems to result in a standard SharePoint list with attached workflows.
For this specific solution, pushed or frequent updates are required.

Is JasperReports suitable for MySQL-based reports?

I have a situation where I need to generate reports from a data mart built in MySQL. I have to use open-source tools only. Can you please suggest tools that best fit my requirement? Is JasperReports right for this?
Thanks
There are many open-source tools you can use, such as JasperReports Server and Pentaho.
I would prefer JasperReports Server; it has all the capabilities you would want, such as:
Letting users create reports and download them as Excel or PDF
Drill-downs
Crosstabs
Graphical charts
Easy integration into a web-app-style page full of these reports
Easy integration with your databases
Easy integration with Java and Eclipse
You can find more about JasperReports Server here.
JasperReports Server comes with a PostgreSQL database by default, but you can also connect to a MySQL database.
Here you can see a comparison of other open-source reporting tools.

SQL Server Reporting Services: best advice for integrating with other technologies?

I'm looking to implement SQL Server Reporting Services as our standard reporting platform in our company. We were trialing Crystal Reports, but alas it seems to be plagued by issues.
SQL Server Reporting Services looks to be a great product, but I have a concern or two.
I have some existing web apps in ColdFusion, and the backend is in MySQL. If I move forward with SQL Server Reporting Services, how should I set up my environment? Is there a JDBC connector or is ODBC the only way to talk to this?
How does it integrate for the user? Will I need to re-authenticate the user to view the reports? Will I need to put a link in ColdFusion to link to the Reporting Services system? Is there a way to make it seamless for the user?
Should I port all the backend to SQL Server to fully leverage the SQL Server platform? Should I convert my existing apps to ASP.NET, and make the entire platform SQL Server / ASP.NET?
It's not too bad, in that the existing apps and MySQL data aren't too big to port. So I guess I'm just looking for some best-practice advice on whether it's okay to use the Reporting Services component on its own, or whether I'd be much better off consolidating everything into a Microsoft solution.
I think it'll be easier than you think!
Reporting Services will happily pull reports from any OLEDB or ODBC source, and MySQL has ODBC drivers, so there's no problem getting at your data through an SSRS report.
You can set up the authentication in a number of ways. If your users are already authenticated in your Windows domain, this will be easier:
SSRS needs to know who is viewing the report. It will allow anonymous viewing (if you enable it; it's off by default), but if you're using IE and are logged in to the same domain as the IIS server, it's completely transparent
SSRS can then use this identity to connect to the data source, or it can use another identity. This is configurable per report or per data source.
One thing you could do is embed your reports in iframes within your ColdFusion pages. This would make the whole thing seamless. The reports are accessible by requesting an appropriately formed URL from the report server, so it's quite flexible.
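For example, a page (ColdFusion-served or otherwise) can point an iframe at a standard SSRS URL-access request; the server, folder, report and parameter names here are placeholders for your own setup:

    // Point an existing <iframe id="reportFrame"> at an SSRS URL-access request.
    // Server, folder, report and parameter names are placeholders.
    var reportUrl = "http://reportserver/ReportServer" +
        "?/Sales/MonthlySummary" +   // hypothetical folder/report path
        "&rs:Command=Render" +       // render the report
        "&rc:Toolbar=false" +        // hide the SSRS toolbar so it sits cleanly in the iframe
        "&Region=West";              // report parameters are passed as &Name=value
    document.getElementById("reportFrame").src = reportUrl;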
As for changing everything to ASP.NET, you'll really only get benefits from that if you ever need to write (and integrate with) your own code to manage the reporting server, or write custom extensions (data providers, delivery extensions and the like), but in my experience this is so rare as to be not worth considering. Go with what you have for now.