How can I export collected metrics data from web application using javamelody - java-melody

I have my web application in production using the javamelody APIs; it has been collecting daily metrics for the last 2 months. I want to export the collected metrics data from the production environment and ship it to the test environment to do report analysis. I see that javamelody uses rrd files to save the historical data reports.
Is there any way I can export the historical data reports and import them into my local environment's web app? I don't want the pdf reports.
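One approach, sketched below under some assumptions: javamelody persists its history as JRobin `.rrd` files plus `.ser.gz` counter files in its storage directory (by default a `javamelody` folder under the temp directory, configurable with the `storage-directory` parameter). Copying that whole directory into the storage directory of the other instance, ideally while both applications are stopped, is one way to migrate the history. The paths and class name here are placeholders, not from the original question:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
import java.util.stream.Stream;

/**
 * Sketch: copy a JavaMelody storage directory so its history
 * can be dropped into another instance's storage directory.
 */
public class JavaMelodyExport {

    /** Recursively copies every file under src into dst, preserving layout; returns the file count. */
    public static int copyStorage(Path src, Path dst) throws IOException {
        int copied = 0;
        try (Stream<Path> paths = Files.walk(src)) {
            for (Path p : (Iterable<Path>) paths::iterator) {
                Path target = dst.resolve(src.relativize(p).toString());
                if (Files.isDirectory(p)) {
                    Files.createDirectories(target);   // keep any sub-directory structure
                } else {
                    Files.createDirectories(target.getParent());
                    Files.copy(p, target, StandardCopyOption.REPLACE_EXISTING);
                    copied++;
                }
            }
        }
        return copied;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical locations: the production storage dir and a local export dir.
        Path src = Paths.get(args.length > 0 ? args[0] : "/tmp/javamelody");
        Path dst = Paths.get(args.length > 1 ? args[1] : "javamelody-export");
        System.out.println("Copied " + copyStorage(src, dst) + " files");
    }
}
```

The target web app must use the same application context name so that the copied file names match what it looks for.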
Many thanks

Related

What are my options for serving a dynamically generated PDF report as a download from an ASP.NET webserver?

I'm working on an application that stores a large amount of user-entered data in a MySQL DB regarding their performance in a competitive activity. We have a web interface that provides data visualization in the form of charts and graphs.
We're looking for a way to generate a PDF from a user's data using a provided UserID in our ASP.NET backend that users will be able to download.
The team's current thinking involves using a reporting tool like SSRS, generating the .rdl file from a template using the UserID, rendering the report as a PDF in the backend, and sending it to the user to download.
I'm very unfamiliar with report generation, but the more research I do, the more I wonder whether a dynamic SSRS report is the right strategy for what we're trying to accomplish. If I were just doing this myself, I'd probably build a new view of the data using a print CSS stylesheet, but I'm not sure what the downsides of that strategy would be relative to building a report.

Access a shared data source, created in the SSRS web portal, from Report Designer

I recently started learning SSRS. I created a data source in the SSRS web portal, and I am wondering whether such a data source can be accessed from Report Designer to create reports and datasets.
Would anyone please help me understand the scope of a data source created in the SSRS web portal?
Thank you for giving your valuable time.
Generally you will create your shared data sources in Visual Studio/Report Designer and then publish them to the server. The end result is that you will have a data source available on the server (as it sounds like you have now).
By default, data sources are not overwritten when you deploy reports. The idea is that you can have a data source on your development machine and, as long as it has the exact same name on the server, your deployed report will look for the data source of that name.
Since it sounds like you created the data source directly in the web portal, you will have to recreate it on your development machine with the same name; when you deploy the report, it will use the version on the server.
This also means that you can have a data source on your development machine, called dsSales for example, pointing to myTestServer\myDatabase, while on the production server a data source with the same name points to myProductionServer\myDatabase (assuming the same tables etc. exist on both). You can then test with data from the test server, and when you deploy, the report will connect through the data source that points at the production server.
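To illustrate the name-based binding with a trimmed RDL fragment (not from the original answer; the data source name is just the example above): the report stores only a reference by name, so the connection details always come from whichever server it is deployed to:

```xml
<DataSources>
  <!-- Only a reference by name is stored in the report; the actual
       connection string lives in the shared data source on the server. -->
  <DataSource Name="dsSales">
    <DataSourceReference>dsSales</DataSourceReference>
  </DataSource>
</DataSources>
```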
I hope that clears it up a little.

Automate unattended running of an Access module periodically

I have an Access database that uses data pulled from an API; this data export runs hourly.
I would like the Access database to automatically run an update module (which imports the exported data and cleans it up) on the same schedule the API exports my data, so Access will be current at all times.
The database I designed is used company-wide. The .accde front end and the back end are both located on a Windows Server 2008 machine.
I just wanted suggestions on the best way to automate this without the need for human intervention, and preferably without involving my local machine.
From rough research (accompanied by my own ignorance and stupidity) I'm assuming I need to set up a task in Task Scheduler on the server to launch Access and run the module on open if the FOS username is the local machine's user (administrator).
I highly welcome any feedback as, again, I just taught myself Access these past two months and am horribly unknowledgeable on the implementation side.
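A sketch of the Task Scheduler approach described above; all paths, task and macro names here are placeholders, not from the original post. Access's `/x` command-line switch runs a named macro at startup, so a scheduled task on the server can launch Access hourly and run the update:

```
schtasks /create /tn "HourlyAccessImport" /sc hourly ^
  /tr "\"C:\Program Files (x86)\Microsoft Office\Office14\MSACCESS.EXE\" \"D:\data\frontend.accde\" /x RunUpdate"
```

The hypothetical RunUpdate macro would use a RunCode action to call the VBA update function and then a Quit action to close Access, so no human intervention is needed.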

Tableau custom web-connector

Goal
At the moment I use Tableau for generating reports based on the data I store at KnackHQ (an online database service with a REST API). I currently have to manually extract data as Excel (or .csv) from KnackHQ and upload it to Tableau. I would like to automate this process.
Investigation
It seems there are several options to try:
Web connector via ODBC (it seems I would need to write my own ODBC driver in my case, which can be tricky)
Export CSV (http://databoss.starschema.net/tableau-live-data-connection-to-csv-over-http-rest/)
OData (http://www.odata.org/)
Data Engine API (http://www.tableau.com/new-features/data-engine-api-0)
Any others?
Any ideas as to what would be the best method for such an integration in terms of simplicity?
Tableau 9.1 (to be released in 2015 Q3, though a beta is already available) has a new feature: the Tableau Web Data Connector. This is exactly what you need, since:
You can build custom connectors using the Tableau WDC JavaScript API
Create & refresh data sources in Tableau Desktop using these connectors
Your administrator can import the connector; from that point, Tableau Server can schedule and refresh the extracts based on your web-service data
I would suggest checking out these posts on how to write a new connector:
5 Things to Know about the Tableau 9.1 Web Data Connector
Google Spreadsheet with Tableau Web Data Connector
MongoDB Tableau Web Data Connector
Github Tableau Web Data Connector
You can also check the official Tableau documentation, for example on managing connectors: Web Data Connectors in Tableau Server
If you want to try this out before it is released, please follow the instructions on the Tableau Community site here: http://community.tableau.com/message/327560.

MS BI Warehouse project

Is there any link or zip file where I could get a whole sample MS BI warehouse project (2008), from start to end?
Incremental loads, and possibly creating cubes too, plus the kinds of problems people face in real projects, things like that.
I could find pieces of this on YouTube but couldn't tie them together. Please help.
Rohan
I think the best reference implementation for the MS BI stack is Project REAL. According to Microsoft:
In Project REAL we are creating a reference implementation of a business intelligence (BI) system using real large-scale data from a real customer. The goal is to discover the best practices for creating BI systems with SQL Server 2005 and to build a system that exhibits as many of those best practices as we can. This project is not just a demo; we are creating this system for ongoing operation. It is a complete system, including daily incremental updates of the data, large multiuser workloads, and system monitoring.
It contains:
A set of instructions for setting up the environment
Guidance on how to explore the implementation
A sample relational data warehouse database (a subset of the Project REAL data warehouse)
A sample source database (from which we pull incremental updates)
SSIS packages that implement the ETL operations
An SSAS cube definition and scripts for processing the cube from the sample warehouse
Sample SSRS reports
Sample data mining models for predicting out-of-stock conditions in stores
Sample client views in briefing books for the ProClarity and Panorama BI front-end tools
You can download it here: http://www.microsoft.com/download/en/details.aspx?id=12134
You can get the AdventureWorks database and data warehouse (with the cube) here: http://msftdbprodsamples.codeplex.com/
I'm not sure about the SSIS packages.