This post presents an interesting idea: CSS variables. On the surface this could be brilliant, though there are some potential flaws.
Firstly, there is the issue of redefinition: what happens if a variable is defined and then redefined? This is particularly pertinent if the variable has already been used before it is redefined.
Then there is the issue of CSS injected via JavaScript. The final issue I will raise about this is browser support.
Browser support is an issue close to my heart and one I feel strongly about. I firmly believe that with progressive enhancement, sensible design and a client who will tolerate some differences between browsers, a website can be made to support multiple browsers with minimal effort.
Innovation is fantastic and should be encouraged, but changes to the CSS specs can cause issues. We cannot start using this sort of new feature until a significant proportion of users are accessing our websites with a browser that supports it.
So, what can we do about this? It is hard to reconcile the need for general cross-browser support and the desire for improvements and more features in the underlying specifications we use to build websites.
As I see it, there are a number of things that can be done. Firstly, we can educate our clients that it is OK for there to be differences. Secondly, we could build websites in "the classic manner" and also include the CSS rules introduced by updates to the specification - but this effectively means building and maintaining two sets of styles, which may not work that successfully.
As always, we interface developers will generally overcome these issues with a combination of techniques based on the principles of building to web standards and of progressive enhancement.
What do you think we can do to overcome these issues? How can we bring about changes and enhancements as wide-ranging as CSS variables without compromising the experience for the installed base of users whose browsers do not support new features and who can take many years to upgrade?
27 April 2008
23 April 2008
When perfect is not perfect
I must recommend this post by Marcus Alexander. He moves forward the argument against pixel perfection, which I have already talked about in a previous post.
It is my belief that achieving pixel perfection across a wide range of different browsers costs a project a disproportionate amount of effort and distracts interface developers from the truly important aspects of a website, which should be perfect. I am talking here about minor differences, generally related to the rendering of standard elements - form fields, for instance.
We should be aiming for valid markup, high quality, well organised CSS and unobtrusive, object oriented JavaScript - we should never have JavaScript errors. We should be working towards best practices and we should be implementing web standards wherever possible.
We should be listening to what our clients want and delivering a high quality solution which successfully addresses the problems the client wants solved.
However, we should not be expected to work around every single minor layout issue or to change system defaults. Users understand how their default system controls work; they are inherently usable and accessible. A user also does not care if a bit of text is a few pixels out.
Let's take an example from another aspect of life: television. Television producers will ensure that their programme is executed perfectly, with a fantastic script, perfect camera work, flawless sound and so on. However, if you are viewing a programme broadcast in widescreen on a non-widescreen television, you will miss part of the picture - the edges are removed to fit it onto the screen. This is a good example because the producer can decide which part of the picture gets cut off.
I must reiterate, however, that it is vital to produce high quality work which conforms to industry best practices and is accessible to as many users as possible. There is a big difference between accessible to all users and perfect for all users.
As Marcus states, no end user of the website is going to be aware of small differences between browsers - and nor should they be. They will only be unhappy if the website does not operate or is so poorly laid out as to be unusable on their browser of choice.
I am not suggesting that anything will do. I am suggesting quite the opposite. We need to be perfect in most things. But as long as the differences are small, the basic site layout is not compromised and the full website is usable to all, achieving pixel perfection does not produce the returns its cost surely demands.
22 April 2008
Asynchronous JavaScript Part 5 - The JSquared AJAX Object
In my previous post in this series, I introduced some of my thoughts on AJAX. I will now go into detail as to how to use the JSquared AJAX object.
The AJAX object, just like ADIJ, is an instance-based object. That is to say, for each AJAX request you wish to make, you create an instance of the object and then call a method on it to send the request. The same instance can be reused, or new instances can be created.
The AJAX object requires certain parameters to be provided, and it is generally easiest to do so in the constructor; however, the object also has methods for setting these values later.
Only one parameter is required: the URL the request will be sent to. The full list of parameters is:
URL - must be provided either in the constructor or using the setUrl method
method - the HTTP verb to be used for the request. Defaults to GET
onSuccess - the callback function if the request is successful
onFail - the callback function if the request fails
scope - the scope in which to run the onSuccess or onFail handler (the scope sets the value of this within the callback function). Defaults to the handler function itself
timeoutLength - the maximum time to wait for a server response to the request before timing out the request and calling the onFail handler. Defaults to 12 seconds
headers - an array of objects containing key and value pairs which are the additional headers to add to the request
Each of these parameters can also be set via a method once the object has been created.
To send the request, simply call the send method:
var myAjaxRequest = new J2.AJAX( {url: "myAJAXUrl.html"} );
myAjaxRequest.send();
The send method accepts an optional argument: a string of data to add to the request.
As you can clearly see, the parameters are passed into the constructor as a JavaScript object using object notation.
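As a slightly fuller sketch, the other parameters listed above can presumably be passed in the same way. The exact constructor keys are my assumption that the parameter names above map directly onto the object literal, and the data string is purely illustrative:
var myAjaxRequest = new J2.AJAX( {
    url: "myAJAXUrl.html",
    method: "POST",
    onSuccess: function(ajaxObject) {
        // called if the request succeeds - see below
    },
    onFail: function(ajaxObject, failureCode) {
        // called if the request fails - see below
    }
} );
// The URL can also be set (or changed) after creation using the documented setUrl method:
// myAjaxRequest.setUrl("myOtherAJAXUrl.html");
// Send the request, optionally passing a string of data as described above.
myAjaxRequest.send("name=value&page=2");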
If you provide an onSuccess handler, it will be called when the request completes successfully, with the AJAX object passed in as the first argument. This allows all the properties of the request to be accessed.
If the request should fail, the onFail handler will be called. The first argument passed to the fail handler is the AJAX object and the second is the failure code. The code can then be examined by comparing it against the values of the AJAX failure codes object, which looks as follows:
J2.AJAX.FailureCodes = {
general: "xx1",
unauthorised: 401,
notFound: 404,
timeout: 408,
server: 500
}
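So, as a sketch, an onFail handler might branch on the code like this (the handler name and the reaction to each code are merely illustrative):
function handleFail(ajaxObject, failureCode) {
    switch (failureCode) {
        case J2.AJAX.FailureCodes.notFound:
            // the requested URL could not be found (HTTP 404)
            break;
        case J2.AJAX.FailureCodes.timeout:
            // the server did not respond within the timeout length
            break;
        default:
            // any other failure, including the general failure code
            break;
    }
}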
If you do not provide an onSuccess or onFail handler, then the object will simply do nothing - there will be no error shown.
Once again, I have to say that it really is as simple as that. This is a powerful yet easy-to-use AJAX object.
This is the end of the series on Asynchronous JavaScript. I will be writing more about the other features of JSquared in the near future.
21 April 2008
JSquared road map and update
I have literally been swamped by a request for a road map for JSquared. In response, I have created this wiki page.
I have some ambitious plans for JSquared and it may not all be possible - much depends on how much my wife puts up with me working late into the night. But my aims are stated there.
I will be releasing periodic compatibility updates in between these major releases.
The single most requested feature for JSquared thus far is documentation, and it is high on my priority list. I intend to provide it as a set of object models generated with JSDoc, plus an accompanying guide. Following that, I hope to get a JSquared website up and running, full of example code.
A set of unit tests and indeed a full testing platform for JSquared is also very high on the priority list and progress is being made on this. The intention for JSquared 1.1 is to get the core functions unit tested.
FXSquared is making good progress with a basic FX module ready for some heavy duty testing. FXSquared is built around plugins to allow for maximum flexibility.
IE 8 support continues to improve with each commit of the code. I expect IE 8 support to match that of all other browsers for JSquared 1.1.
Please use this post or the wiki on the current JSquared home to comment on the road map, especially the form that the documentation should take.
17 April 2008
CSS reset and pixel perfection
This recent post from Jonathan Snook nicely sums up my feelings on CSS reset files.
I have often argued that CSS reset files will end up causing more problems than they solve. Indeed, I am not convinced they solve a real problem.
The problem, as usually stated, is that the browsers I support do not all apply the same default styling to the elements I may use in my website.
The solution proposed by CSS reset files is to create a baseline set of CSS rules which will, on first inspection, solve this problem. However, I am not sure I agree that the problem is as stated above.
I believe the problem is that each website I code looks and behaves differently and does not use the same set of default styles as other websites I have developed.
The solution I propose is resetting only the CSS styles I actually need in order to make the website look correct, in as few rules as possible. Only then will I be interested in the differences between browsers. I would still use the * reset myself, as I find it generally useful and it makes a fairly large difference when designing my CSS. I would not employ any more of a reset than that.
As an example, my website may not use any level 3 headings. In that instance, my CSS code will not try to select any level 3 headings at all. If later during development I need to use a level 3 heading, I will use the h3 tag and add the relevant CSS code. I have always employed this approach and it has so far never let me down.
For more details on my approach to structuring CSS, I suggest reading this post. Of course ideas change over time, but it is still relevant.
I always believe in code being as lightweight as possible, not purely for speed of download but for ease of maintenance and the retention of my own sanity!
The other point raised by the post I mention above is that of producing cross-browser, pixel perfect layouts. This is a topic that probably deserves its own post, if not a series of posts; however, I rarely strive for pixel perfection. I just do not believe it is that important any more. I would argue that users are becoming aware that websites will look very slightly different on different platforms. I definitely did not explain my position perfectly in my previous post about browser support, so I will clarify it here.
In that post, for level 1 browsers (the only relevant browsers in this discussion) I state:
All features of the website are fully functional. All content is available. The web pages will match the designs provided completely. The web pages will look the same across all browsers at this level.
What I should have said was:
All features of the website are fully functional. All content is available. The web pages will match the designs provided completely with the exception of system entities such as form fields and rendered fonts which may cause a layout to differ slightly. The web pages will look the same across all browsers at this level within the constraints of the different rendering methods for each platform.
That more fully expresses how I feel about pixel perfection. Sometimes it can be done, sometimes it cannot. We should educate our clients that pixel perfection is not always important.
9 April 2008
WOW - Seriously
I saw this post on Ajaxian and could not resist posting it here myself.
It is truly quite awesome!
Super Mario in 14Kb of JavaScript
8 April 2008
outline:0
There is nothing more annoying for those who wish to navigate a website using the keyboard than the extensive use of outline:0 in CSS code. For a more detailed explanation of what this is and what it means, I suggest you read this blog post.
Although I luckily do not suffer from any form of disability and do not actually require any effort to be made by a web developer towards accessibility, I do like to browse using the keyboard on occasion and this can be a highly frustrating problem.
CSS reset files such as the one mentioned in the post can leave developers without a full understanding of the nuances of the platform they are developing for. It is at least in part for this reason that I don't use a CSS reset at all.
My message to all would be not to remove this extremely useful built-in behaviour but actually to enhance it. Try navigating the London 2012 website with the keyboard (a site I led the development on) to see how nice it is to have the outline behaviour enhanced rather than removed.
7 April 2008
Documenting JSquared
One of the biggest missing aspects from JSquared is documentation.
I have been looking at using an automated tool to build documentation from comments in the code (another thing which is currently lacking) and this is going to be a major push for me over the next 2 months.
I have almost settled on using JSDoc. I would be very thankful if the readers of this blog could give any thoughts on JSDoc or any alternatives that I should be looking at.
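For those unfamiliar with it, JSDoc reads specially formatted comment blocks placed above each function and generates documentation from them. A typical block looks something like this (a generic illustration, not a sample of the eventual JSquared comments):
/**
 * Trims leading and trailing whitespace from a string.
 * @param {String} text The string to trim
 * @return {String} The trimmed string
 */
function trim(text) {
    return text.replace(/^\s+|\s+$/g, "");
}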
Thanks in advance.
Safari 3.1 and IE8 Beta 1 support in JSquared
With the release of Safari 3.1 on PC and Mac, the handling of keyboard events has changed, as detailed in this post on Ajaxian.
I am delighted to say that the auto filter object in JSquared 1.0 is fully compliant with Safari 3.1 and no changes to the code are required. This is the only object in JSquared with keyboard handling at present.
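For context, keyboard-driven behaviour of this kind generally boils down to listeners of the following shape (a generic cross-browser sketch, not the actual JSquared auto filter code, with a hypothetical element id), and it is exactly this sort of code that a change in a browser's keyboard event model can break:
var input = document.getElementById("filter");  // hypothetical text field

function onKeyDown(event) {
    event = event || window.event;              // older IE exposes the event globally
    var keyCode = event.keyCode;                // 38 and 40 are the up and down arrow keys
    if (keyCode === 38 || keyCode === 40) {
        // move the highlight within the filtered list
    }
}

if (input.addEventListener) {
    input.addEventListener("keydown", onKeyDown, false);
} else if (input.attachEvent) {
    input.attachEvent("onkeydown", onKeyDown);  // older IE
}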
The big news recently though is the release of IE8 Beta 1. I am currently testing JSquared with this release and things are looking good. I expect there to be some minor changes to the code which will precipitate the release of a JSquared compatibility update. More news on this in the next few weeks.
5 April 2008
Asynchronous JavaScript Part 4 - AJAX (a quick introduction)
In the previous 3 parts of this series on asynchronous JavaScript, I have talked at length about ADIJ. I suggest starting at part 1 to catch up!
AJAX is a term which has been bastardised to describe any sort of interactive behaviour in a web page. A number of years ago, these interactions were known as dHTML (dynamic HTML). Neither AJAX nor dHTML perfectly describes these types of interaction, but I prefer the latter name.
AJAX to me should mean Asynchronous JavaScript and XML. This is not perfect as it is often the case that XML is not appropriate and sometimes JavaScript, JSON, HTML or even plain text is required instead. Nonetheless, the acronym AJAX still has one definite meaning to me!
For the purposes of this post and the remainder of this series I will use it to mean any form of asynchronous HTTP request whose response will be handled using JavaScript code whatever form that response takes.
AJAX is a very useful and powerful technique; the underlying technology was invented by Microsoft for Outlook Web Access and later adopted by other browser vendors. When used in a fairly light-touch manner, AJAX techniques can give the user a greatly enhanced experience.
AJAX involves sending an HTTP request to a web server and getting a response asynchronously which is handled in JavaScript. This means more data can be requested from the server or passed to the server without the user seeing the web page refresh.
An asynchronous request happens in the background - it allows other operations to be performed whilst the HTTP cycle is completed. You provide a callback function to the AJAX object, which is called once the HTTP cycle is complete. I won't go into more detail about the implementation and what it all means, as there are many excellent tutorials already written. W3Schools has one of its own.
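For completeness though, here is a minimal sketch of the raw mechanics that any AJAX object wraps up, including the ActiveX fallback that older versions of IE require (error handling omitted for brevity):
function createRequest() {
    if (window.XMLHttpRequest) {
        return new XMLHttpRequest();                // all other browsers and IE 7+
    }
    return new ActiveXObject("Microsoft.XMLHTTP");  // IE 6 and older
}

var request = createRequest();
request.onreadystatechange = function() {
    // readyState 4 means the HTTP cycle is complete
    if (request.readyState === 4 && request.status === 200) {
        // handle the response, e.g. request.responseText
    }
};
request.open("GET", "myAJAXUrl.html", true);        // true = asynchronous
request.send(null);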
Using AJAX it is possible to do things such as post a form or update the content of a web page without the user going through a page refresh cycle. This makes the web page feel more like an application and can make it better for users to interact with.
Of course, JSquared has an AJAX component which is extremely simple to use yet offers flexibility and power. In part 5 of this series, I will walk through the AJAX object and then discuss how to use it.
3 April 2008
Future proofing
With updates to some of the most popular and widely used browsers soon to appear - namely IE 8 and Firefox 3 - the question has to be asked: how and when should we support them?
It is a fairly simple matter to test your websites in the new versions of these browsers, as beta versions of both are available; however, it can be very time-consuming to fix issues, and clients can find it a bitter pill to swallow.
The biggest problem here is a lack of transparency about how and when these browsers will be released. The issue is much, much greater where IE is concerned.
We know that Firefox is slated for a June release and it is likely that the automatic update system will offer users the new version. It is therefore reasonable to suppose that there will be a fairly rapid surge in the number of users for Firefox 3.
As far as IE is concerned, we cannot be sure of a release date or the mechanism by which the update will be delivered. It is possible that there will be a split between IE 7 and IE 8 users for some time until IE 8 is pushed through automatic updates as a high priority update. This in a way is what has happened with the IE 6 to IE 7 transition.
I think it is reasonable for an interface developer to support 2 versions of a popular web browser (perhaps at different levels of support), but I do not like the idea of having to support 3 versions at any level, particularly when one of those is IE 6!
So, the question is, what to do? I am unsure at the moment, though I am considering testing my websites in Firefox 3 as of now and attempting a good level of support. As far as IE 8 goes, if its IE 7 mode works well, then that is a very good reason to not support IE 8 until it is actually released.
If Microsoft released a road map with release dates for IE 8 and the version afterwards, then I could plan my browser support strategy much more easily. So come on Microsoft, talk to me....