The roads I take...

A possible idea for user agents

When I went to bed late last night, I was not completely satisfied with the blog rant I had just written. Not that anything in it is wrong - I stand by it. But just ranting cannot be the solution; there must be some way to get what the Camino people want without making UA strings suck even more.

Up to now, there have been several ways in which people deal with websites locking out users because of UA strings they "don't recognize". All those variants suck in some way and none stands out as a good solution - not even evangelizing, which is good and noble but just doesn't work in enough cases.

But then, I realized, all those solutions are so 1990s, so static, so Web 1.0 - and we're all talking in terms of the modern, new, shiny, dynamic Web 2.0 all the time. So maybe that great new world has a better solution for this problem as well. And actually, I think it does. I began to think we could just combine all the methods above with some other tooling we have, make everything dynamic, and end up with a cool, new approach! ;-)

Firefox has this nice feature of preventing phishing through lists of known phishing sites that it dynamically updates from the web. Currently, there's a plan to do a very similar thing in Gecko for completely blocking sites that offer malware.

So, what about having a list of sites that need UA spoofing, dynamically maintained by our users, and dynamically downloaded and used by the web browser? That way, we would spoof our UA (by adding whatever token the site needs) only on specific websites. The list would pair the domain name where spoofing is needed with what sort of spoofing to apply, and the client would follow those rules.
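Just to make the idea concrete, here is a minimal sketch of what such a downloaded list and the client-side lookup could look like - the format, the field names, the domains, and the UA strings are all made up for illustration, nothing here is an actual spec:

```javascript
// Hypothetical downloaded spoofing list: one rule per domain,
// saying what sort of spoofing that site needs.
const spoofList = [
  // site only needs an extra token appended to the real UA
  { domain: "example-bank.com", action: "add-token", token: "Firefox/2.0.0.4" },
  // site needs the UA replaced wholesale
  { domain: "example-portal.com", action: "replace",
    userAgent: "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)" },
];

const defaultUA =
  "Mozilla/5.0 (Macintosh; U; PPC Mac OS X; en) Gecko/20070509 Camino/1.5";

// Look up the rule for a host and return the UA string to send.
function userAgentFor(host, baseUA) {
  const rule = spoofList.find(
    r => host === r.domain || host.endsWith("." + r.domain));
  if (!rule) return baseUA;                    // no rule: send the honest UA
  if (rule.action === "add-token") return baseUA + " " + rule.token;
  if (rule.action === "replace") return rule.userAgent;
  return baseUA;
}

// the listed site gets the extra token, everyone else the real UA
console.log(userAgentFor("www.example-bank.com", defaultUA));
console.log(userAgentFor("unlisted.example", defaultUA));
```

The point of the per-rule "action" field is that appending a token and replacing the whole string are different levels of lying, and the milder one should be preferred wherever it is enough.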
Of course, it's bad to do this automatically without telling the user that we are doing non-standard things - after all, the website might tell the user he is using Firefox even though he's using Camino. But then, there's this nice idea of info bars in the browser, which we could use for giving the user feedback about what's happening:
[Image: mockup of the UA spoofing info bar]
This is just a graphical mockup of how it would look, of course - I haven't implemented anything.

The "More Info..." button would open a new tab/window (depending on user prefs) with a page that carries all kinds of info about our spoofing of UA strings on this website. The user can leave comments there, find a contact for nagging the webmaster about the problem, etc. Of course, this is a page on our central site that also delivers the dynamic list and where our users can add spoofing for sites, change spoofing options for those sites, and do similar things. This should be driven by the community and should combine the reporting system with evangelizing options.

Of course, several points are still open in this concept. And last but not least: someone needs to implement this system.
I'm willing to help with bits and pieces where I can, but I'm not a big XUL hacker and I have very little time to work on yet another new project.

It would be very cool to find someone who could implement a system like that.
If we design it well, it can help lots of browsers, not only Minefield, Camino and SeaMonkey, but also any number of others who could implement the same system.
Actually, we may even be able to leverage that system for going back to really short and useful UA strings for all our browsers, including even Firefox. Who knows?

Entry written by KaiRo and posted on 23 June 2007 14:00 | Tags: mozconcept, Mozilla, UA String | 8 comments | TrackBack

Comments


Kroc Camen

from the UK

RE
Way to go stealing my idea :P http://home.kairo.at/blog/2007-06/the_fight_for_the_suckiest_ua_string#p4155
I don't mind. I think that what you are proposing should be handled more generally; otherwise you may need to add more and more info bars for each new type of incompatibility.

Sites that are incompatible, in any way, with Firefox (and standards in general) could be managed, with various user-provided shims to fix them. These could range from the most basic - a User-Agent spoof - all the way up to Greasemonkey-like scripts to patch bad behaviour.

The UI would be similar to what you have shown - "This website is known to not yet be fully compatible with Firefox. Firefox has taken some measures to correct the issue automatically, but cannot guarantee this website will function correctly. Please click the button for more info."

The button would take you to a tab in the page-info dialog to show what shims have been used, and a notice stating that the user could consider contacting the website owner to improve compatibility with Firefox.
23.06.2007 16:59

Matt Nordhoff

from Florida, US of A

Opera already does something like this
Opera lets users configure site-specific preferences (cookies, user-agent spoofing, and more), and it automatically downloads an override_downloaded.ini file that configures things (at the moment, all user-agent fixes plus one "CompatMode Override"). It even auto-downloads a browser.js file that uses JavaScript to fix broken websites.

The problem is that it doesn't warn users when it does this. Your infobar idea is great.

I assume Opera's files are maintained completely by Opera, with no user control except for reporting things. A Web 2.0-ish website sounds pretty neat, except that you have to remember there are stupid and malicious people out there. Let users submit and vote on problems and solutions, but someone has to be there to review things.
23.06.2007 18:34

Keith

from the US

Well, if you guys create such a thing, I think it should be RDF. Create an RDF Seq with the items being links to the browsers' sites and the contents being the UA string. Then you create a list of sites and rdf:about which browser string to use (Firefox, Netscape, IE, etc.). Well, that's my idea of a good way to do it.

Also, make it public domain. That would help the other browsers more than anything.
24.06.2007 10:54

dave

Potential UA solution
If I were you guys, I would be encouraging web developers to stop testing and building websites with Firefox.

Instead they should be using GeckoDeveloperFox (okay terrible name), which is basically Firefox with all the useful web testing tools and extensions baked in and a new UA (adopted immediately by small-fry like Camino and Epiphany and later by Firefox) of Gecko/version/build (or whatever).

This means I can avoid wondering why the internet is so slow every time I turn caching off for testing. It means the web developer stuff won't be fighting for space with my Google toolbar. My personal surfing History/Bookmarks etc. won't get messed up with my work.

Maybe it's just the websites I frequent but it seems an amazing amount of people use Firefox just because of the amazing web development tools. Having a special build for these people could make this better, while streamlining the main release and helping people test their sites properly against Gecko and not just Firefox.

It may not work on its own, but I think it could have a noticeable impact. If you want to influence people, make their life easier - give them a good reason to do as you ask.
25.06.2007 14:29

chithanh

Do it like document.all
Maybe it could be done like the undetectable document.all support.

Check for common Firefox detection routines like indexOf("Firefox") and modify their behaviour, possibly notifying the user of the problem.
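(Editor's sketch of roughly what that interception could look like, using a stand-in navigator object - a real browser would have to hook the property natively, and all names and UA strings here are made up for illustration:)

```javascript
// Stand-in for the real navigator object, just for illustration.
const fakeNavigator = {
  userAgent: "Mozilla/5.0 (Macintosh; U; PPC Mac OS X; en) Gecko/20070509 Camino/1.5"
};

// On sites known to sniff for "Firefox", hand out a navigator whose
// userAgent getter returns a patched string, so checks like
// indexOf("Firefox") succeed; every other site sees the honest value.
function navigatorFor(host, spoofedHosts) {
  if (!spoofedHosts.includes(host)) return fakeNavigator;
  return Object.create(fakeNavigator, {
    userAgent: {
      get() {
        return fakeNavigator.userAgent + " Firefox/2.0";
      }
    }
  });
}

const nav = navigatorFor("picky-site.example", ["picky-site.example"]);
console.log(nav.userAgent.indexOf("Firefox") !== -1); // true - the sniff passes
```

The nice property is the same one document.all emulation has: only the page doing the broken check sees the lie.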

Last edited by KaiRo on 28.06.2007 16:31

26.06.2007 03:59

Tony Mechelynck

from Brussels (Belgium)

Konqueror does it, but not so smartly
FYI, Konqueror (as distributed with openSUSE Linux 10.2) does something similar but static: it comes with a user-settable list of "spoofable" sites, preloaded with the following (I'm not writing out the UA strings in full, but if you want them, email me):

computerworld.com: IE4 on W2K
fernuni-hagen.de: IE5 on Mac PPC
gmail.com: Safari on Mac PPC
google.com: Safari on Mac PPC
logitech.com: IE4 on W2K
merian.spiegel.de: Mozilla 1.7.3 on Linux x86_64
sco.com: IE4 on W2K

The dynamic solution you propose seems better in terms of flexibility and reaction time (if properly maintained), worse in terms of bandwidth and (I guess) manpower at some central site. I'd say: if it can be done, go for it. But should it be all centralised or should the user have some leeway (for edge cases not "yet" covered by the central database)?
03.07.2007 01:49

Da Scritch

Change a lot of JavaScript first!
Just have a look at the latest jquery.js library:

var b = navigator.userAgent.toLowerCase();

// Figure out what browser is being used
jQuery.browser = {
version: (b.match(/.+(?:rv|it|ra|ie)[\/: ]([\d.]+)/) || [])[1],
safari: /webkit/.test(b),
opera: /opera/.test(b),
msie: /msie/.test(b) && !/opera/.test(b),
mozilla: /mozilla/.test(b) && !/(compatible|webkit)/.test(b)
};


Gniiiii? And why not document.all???


As I said in French (here: http://dascritch.net/blog.php/2007/07/05/818-javascript-comme-un-dialecte ), there is too much work to do on DOM support, and so many "well known" JavaScript libraries would have to be rewritten first.

Sorry my bad French
09.07.2007 16:43

Da Scritch

Ah yes, think of modifying
this stupid page: http://www.mozilla.org/docs/web-developer/css1technote/css1tojs.html

very very very ugly practices
09.07.2007 17:31
