Please forgive the OT nature of this, but I've spent a lot of time googling etc and have found nothing. Perhaps someone here knows the answer, or can point me in the right direction.
I have a web app that uses status code 401 (Authorization Required) to challenge the user for an id/password. The client then responds with the HTTP Authorization header etc. All is well.
But I want to make a more 'friendly' logon page: so my server sends a page with status 200 (OK) containing a simple form with two fields, userid and password. (The rest of the page content is the friendly fluff, not relevant to my problem.)
In that case, when the user presses the submit button, the client doesn't build the Authorization header (because it doesn't know it should).
Is there a way (using JavaScript?) to tell a web browser (client) to build a specific header? This needs to work with MS IE6, and I'd LIKE it to work with Firefox...
I need to have the client send an 'Authorization: Basic <base64-encoded userid:password>' header.
I can do the encoding "manually" in a script if there are no "built-in" encoding methods... but how do I actually get the client to include a specific request header?
Thank you, Don
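For reference, the header value Don describes is just the base64 encoding of "userid:password". A minimal sketch in Python (Python here only to illustrate the encoding; the browser-side answer would have to be JavaScript):

```python
import base64

def basic_auth_header(userid, password):
    """Build the value for an 'Authorization: Basic ...' request header."""
    creds = f"{userid}:{password}".encode("utf-8")
    return "Basic " + base64.b64encode(creds).decode("ascii")

print(basic_auth_header("don", "secret"))   # Basic ZG9uOnNlY3JldA==
```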
On Thu, 2005-12-01 at 12:32, Don Russell wrote:
But, I want to make a more 'friendly' log on page: So, my server sends a page with status 200 (OK) with a simple form with two fields: userid/password. (The rest of the page content is the friendly fluff, not relevant to my problem)
Usually the way this is done is to turn off basic authentication and use something cookie-based instead. You can probably find a routine in whatever server scripting language you prefer to handle the login and checking if you don't want to write your own. A side effect is that you can add a 'logout' button to delete the cookie, whereas the only way to get rid of basic authentication credentials is to shut down the browser.
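As a rough illustration of the cookie-based approach Les describes (the token format, secret key, and one-hour lifetime below are made-up choices, not anything prescribed in the thread):

```python
import hmac, hashlib, time

SECRET = b"server-side secret"  # hypothetical key, never sent to the client

def make_session_cookie(userid):
    """Issue a signed, expiring session token instead of echoing credentials."""
    expires = str(int(time.time()) + 3600)          # valid for one hour
    payload = f"{userid}|{expires}"
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"session={payload}|{sig}; Secure; HttpOnly"

def check_session_cookie(value):
    """Validate the token on each request; return the userid or None."""
    try:
        userid, expires, sig = value.split("|")
    except ValueError:
        return None
    payload = f"{userid}|{expires}"
    good = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if hmac.compare_digest(sig, good) and int(expires) > time.time():
        return userid
    return None
```

Deleting the cookie (the 'logout' button) is then just a Set-Cookie with an expiry date in the past.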
On 12/1/2005 10:51 AM, Les Mikesell wrote:
Usually the way this is done is to turn off basic authentication and use something cookie-based instead. You can probably find a routine in whatever server scripting language you prefer to handle the login and checking if you don't want to write your own. A side effect is that you can add a 'logout' button to delete the cookie where the only way to get rid of basic authentication is to shut down the browser.
OK... thank you... that idea occurred to me, but it seems less secure. It seems like such a simple need: a way to tell the browser, "Here, use this page to prompt for the userid and password".
It's too bad that providing a "splash page" to log on from reduces the overall security of the site... maybe I'm missing something. :-)
Don
On Thu, 2005-12-01 at 13:08, Don Russell wrote:
OK... thank you... that idea occurred to me, but it seems less secure. It seems like such a simple need: a way to tell the browser, "Here, use this page to prompt for the userid and password".
It's too bad that by providing a "splash page" to log on from, that reduces the overall security of the site.. maybe I'm missing something. :-)
What you are missing is that basic authentication is about as insecure as it gets: the login and password are passed in plain text on every request unless you run everything over https, and there is no way to make a browser forget them other than exiting every instance. With cookies you can be more creative about what you put in them, how you encode them, how long they're valid, etc. If security is a concern you should be using https anyway, with cookies that only persist for the current session.
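To see how weak the "encoding" is: base64 is not encryption, so the credentials can be recovered from any captured header with one call, no key required. Without https this travels in the clear on every request:

```python
import base64

# The Authorization header value from any captured request...
header = "Basic ZG9uOnNlY3JldA=="
# ...decodes straight back to the credentials.
userid, password = base64.b64decode(header.split()[1]).decode().split(":", 1)
print(userid, password)   # don secret
```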
On 12/1/2005 11:19 AM, Les Mikesell wrote:
What you are missing is that basic authentication is about as insecure as it gets since the login and password are passed in plain text on every request unless you run everything over https and there is no way to make a browser forget them other than exiting every instance. With cookies you can be more creative about what you put in it, how you encode, how long it's valid, etc. If security is a concern you should be using https anyway and cookies that only persist for the current session.
Yes, this is using SSL... so that should be OK. It just seems that the security aspects are moved to the application side of things rather than the system side... and unless every CGI checks the cookie properly, there will be an exposure.
So, you're saying that https with cookies is at least equally secure as https with basic authentication?
As long as requests go through a cgi, I can check the cookie... but what if it's just a plain html document? I want to make sure the cookie is there/valid. (Thinking out loud.. maybe I can force all requests through a cgi)
I do like the "logout button" idea of deleting the cookie before it expires "naturally"....
Thanks, Don
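A sketch of the gatekeeper idea Don is thinking out loud about: rewrite every request to one script that checks the session cookie before serving anything, even plain HTML. The `valid_session` stand-in below is hypothetical; a real check would verify a signature the way a proper session scheme does:

```python
def gate(environ, valid_session=lambda v: v == "good-token"):
    """Decide whether to serve the requested page or redirect to the
    login form. valid_session stands in for a real signature check."""
    # Parse 'name=value; name=value' pairs out of the Cookie header.
    cookies = dict(
        pair.strip().split("=", 1)
        for pair in environ.get("HTTP_COOKIE", "").split(";")
        if "=" in pair
    )
    if valid_session(cookies.get("session", "")):
        return ("200 OK", environ.get("PATH_INFO", "/index.html"))
    return ("302 Found", "/login.html")
```

In Apache terms this would mean something like a mod_rewrite rule or handler sending every protected URL through the script, with the original path arriving in PATH_INFO.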
On Thu, 2005-12-01 at 11:35 -0800, Don Russell wrote:
As long as requests go through a cgi, I can check the cookie... but what if it's just a plain html document? I want to make sure the cookie is there/valid. (Thinking out loud.. maybe I can force all requests through a cgi)
That generally (always, in my experience) makes a server slow to use. Every request needs more cumbersome processing, and you can't navigate back and forth through what you've cached, because most webmasters seem not to care about letting caching work sensibly.
On Fri, 2005-12-02 at 02:11, Tim wrote:
That generally (always, in my experience) makes a server slow to use. Every request needs more cumbersome processing, and you can't navigate back and forth through what you've cached, because most webmasters seem not to care about letting caching work sensibly.
The slowness depends on how your cgi executes. If it is a perl script loading perl on every hit it will be slower. If you use php, mod_perl, fastcgi, speedycgi, java, etc., where the interpreter is loaded once for many hits, you won't really see a speed difference.
Some of the mod_perl application frameworks also hook into the apache authentication mechanism to work a little more directly. There are also a large number of apache authentication modules. I haven't looked at those recently but you might find one that lets your app supply a cookie which will subsequently be accepted directly by apache.
As far as caching goes, it shouldn't make any difference. Anything with basic authentication set should not be cached anyway except by the local browser.
On Fri, 2005-12-02 at 07:57 -0600, Les Mikesell wrote:
The slowness depends on how your cgi executes. If it is a perl script loading perl on every hit it will be slower. If you use php, mod_perl, fastcgi, speedycgi, java, etc. where the interpreter is loaded once for many hits you won't really see a speed difference.
As well as how well you write your program... It does seem however, that nearly every dynamically generated site that I've come across behaves like it's on a 16 MHz 486.
As far as caching goes, it shouldn't make any difference.
Many things will not cache URIs with parameters, e.g. http://example.com/cgi?some+parameters, in the belief that they might be caching something that they really shouldn't.
Anything with basic authentication set should not be cached anyway except by the local browser.
That actually was my main concern (no local caching). Time and time again I've used incredibly slow HTTPS sites where nothing is cacheable. I can't back track (nothing loads, or the server throws a wobbly). I can only navigate via the links on the page. Tough luck if the idiot webmaster made it impossible to go back to where you need to go.
On Fri, 2005-12-02 at 09:44, Tim wrote:
As well as how well you write your program... It does seem however, that nearly every dynamically generated site that I've come across behaves like it's on a 16 MHz 486.
Poke around http://www.marketcenter.com or http://www.futuresource.com keeping in mind that just about everything there is dynamic and the web servers have to pull the data from other sources as requested. Usually the slowest part of the page is loading the ads from a third-party site. These happen to be mostly java-based, but php and mod_perl can do as well.
That actually was my main concern (no local caching). Time and time again I've used incredibly slow HTTPS sites where nothing is cacheable. I can't back track (nothing loads, or the server throws a wobbly). I can only navigate via the links on the page. Tough luck if the idiot webmaster made it impossible to go back to where you need to go.
Part of that is using sensible URLs. If you design the site so the URLs can be bookmarked, you can usually backtrack. Plus, if you keep track of what is sensitive and what isn't, you can put the bulk of your static content in an unprotected directory so caching proxies can work - especially for images and frequently used icons.
On Sat, 3 Dec 2005, Tim wrote:
As well as how well you write your program... It does seem however, that nearly every dynamically generated site that I've come across behaves like it's on a 16 MHz 486.
Don't confuse dynamic with badly written/managed/tuned. Many dynamic sites use PHP and other scripting languages in very naive ways with the nearly inevitable result of tanking the backend database under load.
Frequently the people writing the scripts have no formal training in programming and so have no idea about such basic concepts as caching, profiling, and big-O analysis.
Many things will not cache URIs with parameters, e.g. http://example.com/cgi?some+parameters, in the belief that they might be caching something that they really shouldn't.
It is more than that. CGI and other scripts frequently fail to implement the more advanced sections of the HTTP standard that provide guidance on caching, such as Cache-Control, Last-Modified, Expires, Vary, ETag, or even plain old 'HEAD'. In essence, they *TELL* the browsers not to cache by failing to do so.
Additionally, they often ignore the ability to pass parameters in more transparent ways than '?' GET parameters, and to cache generated content on the server when it is not expected to change, rather than hitting the database repeatedly to do *exactly* the same thing over and over.
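A minimal sketch of the conditional-request handling described above, using ETag and If-None-Match (the MD5-based tag and 60-second max-age are arbitrary illustrative choices):

```python
import hashlib

def respond(body, request_headers):
    """Send an ETag with every response, and answer 304 (no body resent)
    when the client's If-None-Match shows it already has this version."""
    etag = '"%s"' % hashlib.md5(body).hexdigest()
    if request_headers.get("If-None-Match") == etag:
        return (304, {"ETag": etag}, b"")
    # Otherwise send the full body and allow brief caching.
    return (200, {"ETag": etag, "Cache-Control": "max-age=60"}, body)
```

A script that does this lets browsers and proxies revalidate cheaply instead of re-fetching (or regenerating) the whole page every time.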
Take a look at http://newark.org/
That site is nearly entirely Java and mod_perl under the hood. With the exception of images and PDF, almost everything you are seeing is being generated from a database as needed (with an efficient caching system on the server).
Does it feel slow to you?
Benjamin Franz snowhare@nihongo.org writes:
Frequently the people writing the scripts have no formal training in programming and so have no idea about such basic concepts as caching, ... It is more than that. CGI/other scripts frequently fail to implement the more advanced sections of the HTTP standard that provide guidance on caching such as Cache-Control, Last-Modified, Expires, Vary, ETag or even plain old 'HEAD'. In essence, they *TELL* the browsers not to cache by failing to do so.
For a dynamic page, you really do not want the browser caching!
Donald Arseneau:
For a dynamic page, you really do not want the browser caching!
It's not as black and white as that! Making someone reload a page in its entirety every time they go past it is not good for you or them. It's far better to set a sensible caching period: short for sites where that's appropriate, longer for sites that don't change much.
It's the height of idiocy to have to keep on reloading some page that has not changed. I've come across sites where I've loaded a page, followed one link, immediately found it not to be what I wanted, backtracked a page mere seconds later, and had to sit through another half minute of it reloading. This sort of stupidity is typical of sites that think they're newsworthy, with dozens of adverts on the pages, smatterings of articles on the one page, etc.