How to declare multiple unauthorized URLs in Apache Shiro configuration

I am trying out the Apache Shiro framework, having basically downloaded the setup from a project online. I managed to get it working, but I am stuck on a really small issue: I want multiple JSF pages in my project to be accessible without any authorization.
The configuration currently looks something like:
authc = org.apache.shiro.web.filter.authc.PassThruAuthenticationFilter
authc.loginUrl = /login.xhtml
roles.unauthorizedUrl = /login.xhtml
Now I would like to add one more page, /signUp.xhtml, to roles.unauthorizedUrl.
I tried
roles.unauthorizedUrl = /login.xhtml,/signUp.xhtml
but that doesn't work. Is there a way to declare multiple unauthorized URLs in the config?

roles.unauthorizedUrl is the URL to which the user is redirected when they try to access a protected or unauthorized URL. You can only set one such URL; otherwise the framework would face an ambiguity about which URL to redirect to.
If you want to leave a URL unprotected, use the config below in the [urls] section:
/login.xhtml = anon
/signUp.xhtml = anon
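For example, a complete [urls] section along these lines would leave those two pages open while protecting everything else (the catch-all entry is illustrative; Shiro evaluates entries top-down and uses the first match, so the anon lines must come before it):
[urls]
/login.xhtml = anon
/signUp.xhtml = anon
/** = authc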

Related

Web API call not returning

I have a RESTful Web API that is running properly, as I can test it with Fiddler: I see calls going through and responses coming back.
I am developing a tablet application that needs to use the Web API in order to fetch data or make updates in the repository.
My calls do not return, and there is not a single trace in Fiddler to show that my calls even reach the server.
The first call I need to make is to log in. The URI would be this:
http://localhost:53060/api/user
This call would normally return some information about the user (such as group membership, level of authorization and so on). The Web API uses Windows Authentication, so the repository is able to resolve all these fields based on the credentials passed in. As I said, in Fiddler I see the three calls made to the URI as the authentication is negotiated between the caller and the server. The third call returns with a JSON object that contains all information generated from the repository as expected.
Now, moving to my client I have the following:
var webApiClient = new HttpClient(new HttpClientHandler()
{
    UseDefaultCredentials = true
})
{
    BaseAddress = new Uri("http://localhost:53060/")
};
webApiClient.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
HttpResponseMessage response = await webApiClient.GetAsync("api/user");
var userLoginInfo = await response.Content.ReadAsAsync<UserLoginInformation>();
My call to "GetAsync" never returns and, like I said, I see no trace of it in Fiddler.
Any idea of what I'm doing wrong?
Changing the URL where the Web API was exposed seemed to have fixed the problem. Thanks to @Nkosi for the suggestion.
For anyone stumbling onto this question and asking themselves how to change the URL of the Web API, there are two ways. If the simulator is running on the same machine with the Web API, the change has to be made in the "applicationhost.config" file for IIS Express. You can locate this file by right-clicking on the IIS Express icon in the Notification Area (the bottom right corner) and selecting show all websites. Highlight the desired Web API and it will show where the application host configuration file is located. In there, one needs to locate the following section:
<bindings>
  <binding protocol="http" bindingInformation="*:53060:localhost" />
</bindings>
and replace the "localhost" name with the IP address of the machine where the Web API is running.
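For example, assuming the machine's IP address is 192.168.1.42 (a made-up address; use your own), the binding would become:
<binding protocol="http" bindingInformation="*:53060:192.168.1.42" />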
However, this approach will not work once you start testing your tablet app with a real device; IIS Express must be coerced into exposing the Web API to the outside world. I found an excellent Node.js package that can help with that, called iisexpress-proxy.
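From memory, a typical invocation looks roughly like this (the port numbers are illustrative; double-check the exact syntax against the package's readme):
npm install -g iisexpress-proxy
iisexpress-proxy 53060 to 3000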

ExtJS application with json-server not working

I have a small app and I am using a Rest proxy. I set up json-server (https://github.com/typicode/json-server) locally.
I have not changed anything in the server settings. I am able to successfully GET data from the server, but when I try to create data like this:
var people = App.model.myModel;
var ed = new people({"id": 2,"title": "test","body": "test"});
ed.save();
the error that appears in the browser console is:
PUT http://localhost:3000/posts/11?_dc=1427464731634 404 (Not Found)
Can someone point out why it is trying to PUT data and not POST it?
PUT is used to update an item, not create it.
Because you have specified an id value, ExtJS presumes that you want to update an existing record rather than create one, and therefore makes the PUT request.
Most RESTful APIs provide GET, PUT, POST, and sometimes DELETE and LINK endpoints for each entity.
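With json-server's default routes, the distinction looks roughly like this (the payloads are illustrative):
POST /posts  with { "title": "test", "body": "test" }  creates a new post and assigns it an id
PUT /posts/2 with { "title": "test", "body": "test" }  updates the existing post with id 2, and returns 404 if there is none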
I found the problem myself. I was sending the "id" as well, so the server looked for an existing post with id 2, which obviously doesn't exist.
var people = App.model.myModel;
var ed = new people({"title": "test","body": "test"});
ed.save();
Works perfectly

How to get custom-resource file after packaging Metro App?

I have a Metro application in which I am using different service URLs for receiving data. For this scenario I want to change the service URLs after building my application into a package. I added resource files to my app as described on MSDN and tested them using the following code:
var resourceLoader = new Windows.ApplicationModel.Resources.ResourceLoader();
var resourceString = resourceLoader.getString("greeting");
Before packaging, I get the greeting resource string in my app. After packaging, I am no longer able to see my custom resource files, though I can still see the default resource files like en-US, fr-FR, etc.
Can anyone suggest a way to get a custom resource file after packaging?
The way I see it, you need to add the resource files before packaging the app; after that's done, you cannot add additional resources. What you could do is fetch the new service URL from a service and save it locally as a setting or in your DB.
Edit: also, resourceLoader.getString("greeting").value will give you the actual string, or "greeting" in case no resources were found.
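If you go the local-setting route, a minimal sketch could look like this (the setting name, variable, and fallback URL are made up):
var settings = Windows.Storage.ApplicationData.current.localSettings;
// Store the URL fetched from your configuration service
settings.values["serviceUrl"] = newUrlFromService;
// Read it back later, falling back to a built-in default
var serviceUrl = settings.values["serviceUrl"] || "http://example.com/api";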

Retrieving information from a web page

My application is meant to speed up the retrieval of phone call information from our telephone system.
The best way to get this information is to create a new search on the telephone system's web interface and export the results to an Excel spreadsheet which my application then imports into a DataSet.
To get the export, from the login screen, the process goes as follows:
Log in
Navigate to Reports Page
Click "Extension Detail" link
Select "Extensions" CheckBox
Select the extensions (typically all the ones currently being used) from the listbox
Specify date range
Click on Export button
It's not a big job to do it manually every day, but, for reliability, it would be great if I can make my application do this automatically the first time it starts every day.
Since more than 1 person in the company is going to use this application, having a Windows Service do it would be even better.
I don't know if it'll help, but the system is Datatex Topaz Next Generation telephone management system: http://www.datatex.co.za/downloads/index.html#TNG
Can anyone give me a basic idea how to do this?
Also, can anyone post links (in comments if need be) to pages where I can learn more about how to do this?
I have done something similar to fetch info from a website. I cannot give you an exact answer, but the idea is to send the login info to the page as form values. If the site relies on cookies, you can use this cookie-aware WebClient:
using System.Net;

public class CookieAwareWebClient : WebClient
{
    // One container for the lifetime of the client, so cookies
    // set by the login response are sent on subsequent requests
    private CookieContainer cookieContainer = new CookieContainer();

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        if (request is HttpWebRequest)
        {
            (request as HttpWebRequest).CookieContainer = cookieContainer;
        }
        return request;
    }
}
You should be aware that some sites rely on a session id being passed, so the first thing I did was fetch the session id from the page:
var client = new CookieAwareWebClient();
client.Encoding = Encoding.UTF8;
var indexHtml = client.DownloadString(*index page url*);
string sessionID = fetchSessionID(indexHtml);
Then I had to log in to the page, which you can do by uploading values to it. You can see the specific form elements with "view source", but you have to know a little HTML to do so.
var values = new NameValueCollection();
values.Add("sessionid", sessionID); //Fetched session id
values.Add("brugerid", args[0]); //Username in my case
values.Add("adgangskode", args[1]); //Password in my case
values.Add("login", "Login"); //The login button
//Logging in
client.UploadValues(*url to login*, values); //If all goes perfect, I'm logged in now
And then I could download the page I needed. In your case you may use DownloadFile(...) if the file always has the same URL (something like Export.aspx?From=2010-10-10&To=2010-11-11), or UploadValues(...), where you specify the values as before but save the result.
string html = client.DownloadString(*url*);
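If the export requires posting form values, a sketch along these lines could save the result to disk (the URL is a placeholder, and values reuses the collection pattern from above; WebClient.UploadValues returns the response body as a byte array):
// Hypothetical export URL; File.WriteAllBytes needs System.IO
byte[] result = client.UploadValues("*url to export*", values);
File.WriteAllBytes("export.xls", result);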
It seems you have a lot more steps than I did, but the principle is the same. To see what values you send to the site to log in etc., you can use a program such as Fiddler (Windows), which can capture the activity going on. Essentially you just do exactly the same thing, but watch out for the session id etc., which is temporary.
The best idea is really to use some native way to fetch the data, but if you don't have access to the code, database, etc., you have to do it the ugly way. You may also need an HTML parser to extract the data (oops, you don't, because you export to a file). And last but not least, keep in mind that pages can change, so there is great potential for the login, parsing, etc. to fail.
Please ask if you are uncertain about what is going on.
ADDITION
The CookieAwareWebClient is not my code:
http://code.google.com/p/gardens/source/browse/Montrics/Physical.MyPyramid/CookieAwareWebClient.cs?r=26
Using CookieContainer with WebClient class
I also found some relevant threads:
What's a good tool to screen-scrape with Javascript support?
http://forums.asp.net/t/1475637.aspx
With an HTTP client, you need to do the following:
Log in, using cookies or HTTP authentication
Request a page
Submit form data
This means that you need some class or component in your program that can do HTTP, cookies, authentication and forms. With this, you do the same requests a user would do.

Switch to SSL using a relative URL

I would like to create a relative link that switches the current protocol from http to https. The last place I worked had something set up on the server so that you could make that happen, but I don't remember much about it and I never knew how it worked.
The rationale for this is that I wouldn't need to hardcode server names in files that need to move in between production and development environments.
Is there a way for this to work in IIS 6.0?
Edit:
I am using .NET, but the "link" I'm creating will not be dynamically generated. If you really want the nitty gritty details, I am using a redirect macro in Umbraco that requires a URL to be passed in.
Here's a simple solution in VB.NET:
Imports System.Web.HttpContext

Public Shared Sub SetSSL(Optional ByVal bEnable As Boolean = False)
    If bEnable Then
        If Not Current.Request.IsSecureConnection Then
            Dim strHTTPS As String = "https://www.mysite.com"
            Current.Response.Clear()
            Current.Response.Status = "301 Moved Permanently"
            Current.Response.AddHeader("Location", strHTTPS & Current.Request.RawUrl)
            Current.Response.End()
        End If
    Else
        If Current.Request.IsSecureConnection Then
            Dim strHTTP As String = "http://www.mysite.com"
            Current.Response.Clear()
            Current.Response.Status = "301 Moved Permanently"
            Current.Response.AddHeader("Location", strHTTP & Current.Request.RawUrl)
            Current.Response.End()
        End If
    End If
End Sub
Usage:
'Enable SSL
SetSSL(True)
'Disable SSL
SetSSL(False)
You could add this to the Page_Load of each of your pages, or you could do something like I did: create a list of folders or pages that you want secured in your global.asax and set the SSL accordingly in the Application_BeginRequest method, as sketched below. This will work with relative links, and the HTTP or HTTPS status of a page will always be what you tell it to be in the code.
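A minimal version of that Application_BeginRequest approach might look like this (the secured paths are made up; SetSSL is the routine above):
Private Shared ReadOnly SecurePaths As String() = New String() {"/quote/", "/account/"}

Sub Application_BeginRequest(ByVal sender As Object, ByVal e As EventArgs)
    ' Require SSL only for requests under one of the secured paths
    Dim path As String = Request.RawUrl.ToLowerInvariant()
    Dim needsSSL As Boolean = False
    For Each securePath As String In SecurePaths
        If path.StartsWith(securePath) Then
            needsSSL = True
            Exit For
        End If
    Next
    SetSSL(needsSSL)
End Sub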
I have this code in place on several websites. But as an example, if you go to https://www.techinsurance.com you'll notice it automatically redirects to http because the home page doesn't need to be secured. And the reverse will happen if you try to hit a page that needs to be secured such as http://www.techinsurance.com/quote/login.aspx
You may notice that I'm using 301 (permanent) redirects. The side benefit here is that search engines will update their index based on a 301 redirect code.
Which language/framework are you using?
You should be able to create your own function to which you pass the relative page; deduce the host and URL from the HttpRequest object and the Server object (again, depending on the language or framework), and then simply redirect to that URL with https as the prefix.
Here is a good CodeProject article on doing this by specifying certain directories and files that you want to use SSL. It will automatically switch these to and from https based on your needs.
I've used this for a project, and it works really well.
This is the same answer I gave here:
Yes you can. I recommend this free open source DLL that lets you designate which pages and folders need SSL and which don't:
http://www.codeproject.com/KB/web-security/WebPageSecurity_v2.aspx
So you can set up a page to be secure in your web.config like this:
<secureWebPages encryptedUri="www.example.com" unencryptedUri="www.example.com" mode="RemoteOnly">
  <files>
    <add path="/MustBeSecure.aspx" secure="Secure" />
  </files>
</secureWebPages>
We ended up buying ISAPI Rewrite to perform redirects at the web server level for certain URLs. That's not quite the answer I was looking for when I asked the question, but it's what works for us.