SSRS 2016 Object Alignment is off on Server Render

I'm trying to create floor plan maps with data interspersed throughout the map. In both design mode and preview mode in Visual Studio, the report appears with objects perfectly lined up as I expect. However, when I publish to the server and run the report there, object alignment is off. I can't find anything related to my issue, or any way to tell SSRS that I want things placed in a specific location when running on the server. I'm already using rectangles to contain elements, but that doesn't seem to help. The images below show what I'm running into. I have admin access to the server, so if I need to make changes there, I can. I'm happy to share my .RDL file if I can figure out how to upload it.
Any ideas to fix this infuriating problem would be welcome.
(Screenshots in the original post: design mode, preview mode, and the render on the server.)

Unfortunately, SSRS is pretty bad for something like this that requires precise positioning. There isn't really a setting you can adjust to fix it; it's just not meant for this type of application. Additionally, Report Manager (in the browser) always messes up the formatting. I always recommend exporting the report to a .pdf to see how it looks before finalizing anything. With that said, if this is client-facing, I'd suggest adapting this to use a .pdf viewer, or to simply export and open a .pdf. That cleans things up and gives you a uniform result every time.
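If you do go the PDF route, it's worth knowing that SSRS can render a report straight to PDF through URL access by appending rs:Format=PDF to the report URL (the server name and report path below are placeholders, not from the question):

    http://yourserver/ReportServer?/FloorPlans/FloorPlanMap&rs:Command=Render&rs:Format=PDF

That gives users the PDF renderer's layout every time, bypassing the HTML viewer entirely; the same format parameter also works in subscriptions and when fetching the report from code.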
Another thing I could suggest, if you insist on using SSRS, would be to redo the report using tablixes as opposed to rectangles. These tend to hold their shape better, and with some creative borders you could produce a similar result.

Related

displaying spreadsheet/grid on a browser after loading from Excel

I need some help with picking the right tools for a project.
I'd like to import a bunch of Excel spreadsheets (preferably server-side), join a number of columns, and display a single responsive, interactive grid/table/HTML5 spreadsheet in the browser for the end user. What are some of the better tools to go about this? Would Solr or Logagent fit the bill?
Is there a mechanism (in the tool, for example) to make what is displayed to the end user in the browser via data tables secure, i.e. so that only the data fields the user is allowed to view are displayed?
Your question is too generic to provide a detailed answer.
Whenever I need to have Excel functionality in the browser I use SheetJS:
http://sheetjs.com/
Check out their demos:
http://sheetjs.com/demos/manifest.html
Hope this helps.
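For a concrete flavor, here is a minimal browser-side sketch of the SheetJS workflow: read an uploaded .xlsx and dump its first sheet into the page as an HTML table. The element IDs are hypothetical; XLSX.read and sheet_to_html are the library's documented entry points.

    // Assumes the SheetJS package ("xlsx" on npm, or the standalone
    // script that exposes a global XLSX) and a page containing
    // <input type="file" id="file"> and <div id="out"> (names assumed).
    import * as XLSX from "xlsx";

    const input = document.getElementById("file") as HTMLInputElement;
    const out = document.getElementById("out") as HTMLDivElement;

    input.addEventListener("change", async () => {
      const file = input.files?.[0];
      if (!file) return;
      const bytes = new Uint8Array(await file.arrayBuffer());
      const wb = XLSX.read(bytes, { type: "array" });  // parse the workbook
      const first = wb.Sheets[wb.SheetNames[0]];       // take the first sheet
      out.innerHTML = XLSX.utils.sheet_to_html(first); // render as a table
    });

Joining columns across several workbooks would happen between the read and the render, e.g. by converting each sheet to row objects with XLSX.utils.sheet_to_json and merging on a key column.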

Using a database in html without a server

I have a question regarding the client-side possibilities of an HTML webpage. I am currently working on my master's thesis in aerospace engineering and plan to develop a design catalog for inspiration. I chose HTML for this purpose because I have a basic understanding of it, want to learn more, and love the possibility of changing the style (e.g. width, height, padding, …) of all elements with a single command.
Unfortunately, there is no way of installing a local server or any other kind of software on the university computers. Besides, there is no way I can get my colleagues (supervisor, professor) to run additional software just to look at my catalog. Publishing the webpage on the intranet or internet is totally out of the question. The only possibility I have is using a browser (Firefox) and an editor.
I thought about two different approaches:
Consider having a (Windows) folder containing images. Is it in any way possible to load all images from this specific folder into a webpage without anything server-side and without installing any kind of software?
This would be even better: is it in any way possible to make a website load data from a database into its own HTML code? I would like to be able to load only specific elements, e.g. all designs with 2 degrees of freedom, or only load all images.
Any kind of database! I don't care if it is MySQL, MariaDB, Excel, or even a CSV file or something completely different.
Security of the code is not an issue. The data is sensitive, but I have full control over the database and code, and there are no plans to ever publish the webpage or host it anywhere.
I would appreciate any comment on whether this is in any way possible, or whether I definitely need a server (e.g. XAMPP) to realize my approaches. If there is no possibility, I would need to add every image by hand or scrap the idea of the design catalog.
Thank you in advance!
Best regards, REn0
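For what it's worth, one common no-server pattern for exactly this situation (not from the thread; all file names and fields below are invented for illustration): modern browsers block fetch/XMLHttpRequest on file:// pages, so the usual workaround is to keep the "database" in a plain script file that the page loads with a script tag.

    // data.js, kept next to the HTML file and loaded via
    // <script src="data.js"></script> before the main script:
    //   var designs = [
    //     { name: "Flexure hinge", dof: 2, image: "img/flexure.png" },
    //     { name: "Ball joint",    dof: 3, image: "img/ball.png" },
    //   ];

    // Main page script: filter the catalog (e.g. all designs with
    // 2 degrees of freedom) and render the matching images.
    declare const designs: { name: string; dof: number; image: string }[];

    const list = document.getElementById("catalog")!; // assumed container
    for (const d of designs.filter((x) => x.dof === 2)) {
      const fig = document.createElement("figure");
      fig.innerHTML =
        `<img src="${d.image}" alt="${d.name}"><figcaption>${d.name}</figcaption>`;
      list.appendChild(fig);
    }

Maintaining data.js by hand (or exporting it once from Excel) is the price of having no server, but filtering and styling then work exactly as they would with a real backend.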

The selected report is not ready for viewing. The report is still being rendered or a report snapshot is not available. (rsReportNotReady)

I am encountering the error mentioned in the subject every time I use interactive sorting on the exported .HTML file.
Here's the scenario: I have a report (with an interactive sorting column) for which I created a subscription that runs and generates an .HTML file every 2 minutes.
I would like to know whether the interactive sorting feature will still work in the .HTML that was generated. Let me know your thoughts and any tricks you might have.
Thanks.
Try removing the compatibility view setting for intranet sites if you can.
Also check whether your browser and OS support the interactive features of SSRS here; there is a table at the bottom of that page.
There is a long rant here that discusses the problems with rendering SSRS reports in HTML. It also shows some workarounds you can do with jQuery.
If all that fails, just render your reports to Excel; your users can filter and sort them any way they want.
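To give a flavor of the client-side workaround: the exported HTML has no live connection back to the report server, so a script has to reorder the table rows itself. A minimal sketch (the table id is an assumption; the linked rant does the equivalent with jQuery):

    // Sort a static report table by whichever column header is clicked.
    const table = document.getElementById("report") as HTMLTableElement;

    table.tHead?.addEventListener("click", (e) => {
      const th = (e.target as HTMLElement).closest("th");
      if (!th) return;
      const col = th.cellIndex;
      const body = table.tBodies[0];
      const rows = Array.from(body.rows);
      rows.sort((a, b) =>
        a.cells[col].innerText.localeCompare(
          b.cells[col].innerText, undefined, { numeric: true }
        )
      );
      rows.forEach((r) => body.appendChild(r)); // re-attach in sorted order
    });

Note this re-implements sorting rather than restoring SSRS's interactive sort, which only works inside the live report viewer.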

get data from website not showing up in html mark-up code

I am trying to retrieve data from a website for which the parameters you need to define do not show up in the URL, i.e. http://www.vmm.be/webrap/ibmcognos/cgi-bin/cognosisapi.dll?b_action=cognosViewer&ui.action=run&ui.object=%2fcontent%2ffolder[%40name%3d%27Water%27]%2ffolder[%40name%3d%27Afvalwater%27]%2freport[%40name%3d%27Individuele%20analyseresultaten%20per%20RWZI%27]&ui.name=Individuele%20analyseresultaten%20per%20RWZI&run.outputFormat=HTML&run.prompt=false&ui.backURL=%2fwebrap%2fibmcognos%2fcgi-bin%2fcognosisapi.dll%3fb_action%3dxts.run%26m%3dportal%2fcc.xts%26m_folder%3di5DDA04E5A00C4B6AB6DF44BB4FAD7CEC&p_RwziNr=51&run.prompt=false
How can I extract the data for different years and parameters programmatically?
I am using MATLAB's urlread, but since the data I want to import does not show up in the HTML code (I have checked this with the Web Developer Toolbar in Firefox), nothing is being read in. I have no experience with websites, only MATLAB and C programming, so I have no idea how the data can be shown in the browser if it is not in the HTML source code. Could someone point me in the right direction on how to get this job done? Is it at all possible? I hope so, because I will have to repeat this for around 500 measuring stations, each over 10 years, so I am not planning on copying the required data manually as I did before when I just needed one station.
It turns out it is not possible to do what I want in MATLAB. I did, however, manage to get the required data programmatically by using a combination of Selenium with C# and ChromeDriver. It's slow, but it works, and I can do other things in the meantime, so I can recommend it to anyone who is downloading data from servers in a tedious way.
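The answer above used C#; here is the same idea sketched with the JavaScript Selenium bindings (the selenium-webdriver and chromedriver packages). The selector and the wait target are assumptions, since the actual structure of the rendered Cognos report page would need to be inspected first:

    import { Builder, By, until } from "selenium-webdriver";

    // Load the report URL in a real browser so the viewer's JavaScript
    // can fetch and render the data, then scrape the resulting table.
    async function scrapeStation(url: string): Promise<string[]> {
      const driver = await new Builder().forBrowser("chrome").build();
      try {
        await driver.get(url);
        // Wait (up to 30 s) for the viewer to finish rendering a table.
        const table = await driver.wait(
          until.elementLocated(By.css("table")), 30_000
        );
        const cells = await table.findElements(By.css("td"));
        return Promise.all(cells.map((c) => c.getText()));
      } finally {
        await driver.quit();
      }
    }

Looping scrapeStation over the ~500 station URLs (varying p_RwziNr in the query string) automates the whole collection: slow, but unattended.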

How to take screenshot of rendered HTML page

Our web analytics package includes detailed information about users' activity within a page, and we show (click/scroll/interaction) visualizations in an overlay atop the web page. Currently this is an iframe containing a live rendering of the page.
Since pages change over time, older data no longer corresponds to the current layout of the page. We would like to run a spider to occasionally take snapshots of the pages, allowing us to maintain a record of interactions with various versions of the page.
We have a working implementation of this (Linux), but the snapshot process is a hideous Python/JavaScript/HTML hack which opens a Firefox window, screenshotting and scrolling and merging and saving to a file. This requires us to install the X stack on our normally headless servers, and takes over a minute per page.
We would prefer a headless implementation with performance closer to that of the rendering time in a regular web browser, but haven't found anything.
There's some movement towards building something using Mozilla source as a starting point, but that seems like overkill to me, as well as a maintenance nightmare if we try to keep it up to date.
Suggestions?
An article on Digital Inspiration points towards CutyCapt, which is cross-platform and uses the WebKit rendering engine, as well as IECapt, which uses the present IE rendering engine and requires Windows, natch. Nothing comes to mind off the top of my head that uses Gecko, Firefox's rendering engine.
I doubt you're going to be able to get away from X, however. Since CutyCapt requires Qt, it requires either X or a Windows installation. Similarly, IECapt will require Windows (or Wine if you want to try running it under Linux, and then you're back to needing X). I doubt you'll find a rendering engine that doesn't require Qt, GTK, GDI, or Cocoa, and that therefore doesn't need a full install of display libraries.
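One mitigating note: you don't need a full desktop X stack, only Xvfb, the in-memory X server. The usual recipe (it appears in CutyCapt's own documentation; the URL and output name below are placeholders) is:

    xvfb-run --server-args="-screen 0, 1024x768x24" \
        ./CutyCapt --url=http://example.com --out=example.png

Xvfb renders to a framebuffer in memory instead of a display, so a headless server only needs the xvfb package installed rather than a desktop environment.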
Why not store the HTML that is sent out to the client? You could then redisplay it in a web browser as a page to show what it looked like.
Using your web analytics data about user actions, you could then default the combo boxes, fields, etc. to the values the client would have had, and even change the CSS on buttons to mark them as pressed.
As a benefit, you don't need the X stack, and you don't need to do any crawling or storing of images.
EDIT (Re Andrew Moore):
This is where you store the current CSS/images under a version number. Place an easily parsable version number in a comment in the HTML. If you change your CSS/images but keep the existing names, increment the version number in the HTML output.
The system that stores the HTML will know that it needs to grab a new copy of the assets and store them under the new number. When redisplaying, it simply uses the version number to determine which CSS/image set to use.
We currently have a system here that works in a very similar way, so we can track user actions and provide better support when users call our help desk: support staff can bring up the user's session and follow what they did, even somewhat live.
You can even code it to auto-censor sensitive fields when the HTML is stored.
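To make the versioning idea concrete, a small sketch: the page carries its asset version in an HTML comment, and the archiver parses it to decide whether a new CSS/image snapshot is needed. The comment format is invented for illustration:

    // The rendered page embeds e.g. <!-- assets-v: 12 --> near the top.
    const VERSION_RE = /<!--\s*assets-v:\s*(\d+)\s*-->/;

    function assetVersion(html: string): number | null {
      const m = VERSION_RE.exec(html);
      return m ? Number(m[1]) : null;
    }

    // On store: if assetVersion(html) is one we haven't seen, also copy
    // the current CSS/images into e.g. assets/v12/. On redisplay, serve
    // the stored HTML with the matching assets/vN/ set.

The version only needs bumping when an existing file name's contents change; brand-new file names are unambiguous either way.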
Depending on the specifics of your needs, perhaps you could get away with using one of the many free webpage thumbnail services? SnapCasa, for example, lets you generate thousands per month, with no charge and no advertising. (I've never used it; I just googled 'free thumbnail service' to find it.)
Just a thought.