by WebKeyDesign | Apr 11, 2006 | News & Trends
I have been thinking more and more about the downsides of today’s Internet. While the “always on” and “24-hour self-service” aspects are talked about constantly, I find that on many occasions these claims either turn out to be false or, worse, a severe letdown. It is interesting that sites like Amazon.com, which cornered the online retail market with innovations such as 1-click ordering and customer reviews, no longer mean much. In a sense, the Internet has become stagnant, and I am not sure whether this is partly due to Microsoft not really investing much in Internet Explorer or to much of the Internet adopting a “me too!” mentality.
The rampant rise of the commercial side of the Internet has made it grow immensely since the 90s, but the cost of all this growth is a serious signal-to-noise problem. If I go looking today for a specific piece of information on how to fix a computer problem, and the problem is too specific, I will almost never find a solution. Google and Yahoo will try their best to bring up some results, but most of the results will be junk sites stuffed with ads. Worse, when something relevant does come up, you usually run into protected or pay-for-information sites: sites which once offered their information for free but then decided to go members-only. Lastly, there are the old links which point to nowhere. All the search engines have built up quite a list of outdated pages which are no longer retrievable at all.
Disappointed By Internet Shopping
Remember when you wanted to buy something and the Internet was helpful? A few years ago you could type in the model number and brand name, along with the word “review,” and instantly find some helpful consumer comments on said product. These days that same search brings up hundreds of online shopping sites that sell the product, offer no consumer reviews, and most of the time cannot even tell you whether the product is in stock or how much it will cost to ship! Just try finding a review for a simple ADSL router or an inexpensive television set, and you will be hard pressed to find anything useful.
For whatever reason, companies have chosen to use the Internet as their cheapest marketing tool, and providing helpful information to consumers is most likely an afterthought. Today, most companies that do provide forums either censor them heavily or let them linger into uselessness. Independent web site operators are also guilty of pushing their sponsors’ products or asking members to pay for access to their archives. In fact, many new sites specifically entice participation so that at a later date they can go commercial and restrict access.
The only exception is the personal blog, but even that is being invaded by the rise of “professional bloggers” and commercially sponsored blogs. All of these changes point to a less useful Internet and a challenge for search engines that want to remain relevant, as the search results become more diluted every day.
by WebKeyDesign | Mar 29, 2006 | News & Trends
A couple of years ago I was telling my good friend, Manzabar, how I thought Windows Vista was going to be a very tough release for Microsoft and Windows users. My only proof was the incredible growing pains which I and many other Mac OS X users had to endure through the many OS X releases that Apple came out with. By far the biggest problem for OS X, other than the Finder, has been the actual speed of the operating system. Everything on OS X seemed painfully slower. Most early users put up with it because of the benefits of OS X’s modern features; for me it was Java, since there were a ton of new Java apps that only worked well under OS X.

As Apple improved OS X, it became more evident that the system had serious bottlenecks: the multi-threading of the FreeBSD layer, the Mach kernel design not being as fast as Linux, the UI changes that made many users curse the new Finder, and so on. But the major issue was, and still is, the graphics layer. When Apple implemented a new graphics system and replaced the old two-dimensional QuickDraw, the UI slowed down immensely. Instead of a window taking a few kilobytes of memory to draw, it now took something like three megabytes per window. The math operations alone for all the open windows bogged down the main PowerPC CPU and made the entire OS sluggish to use. Apple worked with NVIDIA and ATI to offload more and more of the UI drawing to the graphics card, and OS X is now much improved, but it is still a work in progress, and many would argue that the OS did not speed up as much as the hardware got faster. As of today, the G5 PowerMacs and the new Intel-based models are a vast improvement in UI responsiveness.
This brings me to Windows Vista and Microsoft’s first attempt to bring a real 3-D interface to Windows. Of course, since Microsoft writes all the code for DirectX, and since they have waited for video cards to become DirectX 9 compliant, their UI should have fewer problems than Apple’s. But as you can tell from Apple’s latest OS X release, the UI is still being perfected; even after five years of trying to speed it up, Apple is still not finished tweaking it. Vista will have problems running on older hardware, and I am sure Microsoft will end up tweaking just as much as Apple in order to get the 3-D UI to run at acceptable levels. The difference, though, is that Mac users will put up with quite a lot and for some reason don’t seem to mind all that much in the end, whereas Windows users are not exactly forgiving. If Vista turns out to be slow, they will simply hold off on upgrading and wait to purchase new hardware, which is the last thing Intel and AMD want to hear.
Then again I could be wrong and Microsoft might pull it off and deliver an incredible release, with an amazingly fast 3-D graphics system.
by WebKeyDesign | Mar 28, 2006 | Web Site Basics
It seems like every month I notice a new search engine bot crawling my web sites and aggressively using up a lot of my bandwidth. Google’s own bot can easily take a gigabyte of bandwidth a month if you have a decent-sized website with at least 300 pages of content. But AWStats does not identify all bots, so you have to look at the Hosts section and see how much bandwidth your top hosts are taking. An aggressive spider will appear at the top of the list, which gives you the IP address of the host. Most spiders, though, use multiple IP addresses, so what you really want to know is the actual agent name. An easy way to track this down is to look at the actual web server logs and search for the IP address you found in AWStats. In cPanel, the Latest Visitors script (under Web/FTP Stats) shows the last 300 visitors to your site. Once you find the agent name, do a search on Google for it.
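If you have shell access to your account, one quick way to do that log search is with grep; the IP address and log path below are placeholders, and the snippet assumes the common Apache “combined” log format, where the user agent is the last quoted field on each line:

```
# Placeholder IP address and log path -- substitute your own values.
# Pulls every request from that IP, then counts the user agent strings it sent.
grep "66.249.66.1" ~/access-logs/example.com.log | awk -F'"' '{print $6}' | sort | uniq -c | sort -rn
```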
Most spiders are documented by the sites that own them. In general it is a good idea to let spiders crawl your site, but if they take too much bandwidth or make your site slower than usual, then you have to take some action: either slow them down or ban them from specific areas of your site, or from the site entirely. Depending on the spider’s documentation, you might be able to deter or reduce crawling requests using the robots.txt file, as in the sketch below. Some spiders obey only the meta tags in the HTML header, but it is best to use robots.txt where you can, since that change is easier to make than editing all of your HTML files.
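As a rough sketch, a robots.txt file at the root of your domain might look something like this; “ExampleBot” and “BadBot” are made-up names, and keep in mind that Crawl-delay is a non-standard directive honored only by some spiders:

```
# Slow down one aggressive (hypothetical) spider and keep it out of a busy section.
User-agent: ExampleBot
Crawl-delay: 10
Disallow: /forums/

# Ban another bot from the entire site.
User-agent: BadBot
Disallow: /

# Everyone else may crawl normally.
User-agent: *
Disallow:
```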
If all else fails, you are left with blocking the entire IP range that the bot uses. This is a last-resort option, and you should be extra careful in figuring out the exact IP addresses to block, since this will make your site unreachable from any of those addresses.
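On an Apache server, which is what cPanel hosts typically run, one way to do that is with deny rules in an .htaccess file; the addresses below are placeholders rather than any real bot’s range:

```
# .htaccess -- block a (hypothetical) bot's address range as a last resort.
# Double-check the range first: every address listed here loses access to the site.
Order Allow,Deny
Allow from all
Deny from 192.0.2.0/24
Deny from 198.51.100.42
```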
by WebKeyDesign | Mar 16, 2006 | SEO
Yahoo! has a new search tool for webmasters. Site Explorer lists any site’s pages according to their popularity in the Yahoo! database. This is a good way to see how your site’s individual pages rank on Yahoo!.
You can also set up a sitemap and submit your site to Yahoo!. You can now submit feeds as well at Submit.Search.Yahoo.com/Free/Request.
For sitemaps, Yahoo! accepts a plain text file at the root of your domain. Each line should include only one URL, and you can name the file urllist.txt or urllist.txt.gz (if you are using gzip compression).
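A urllist.txt for a small site might look something like this (the domain and paths are placeholders):

```
http://www.example.com/
http://www.example.com/about.html
http://www.example.com/articles/web-fonts.html
http://www.example.com/contact.html
```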
by WebKeyDesign | Mar 7, 2006 | CSS
For the most part, font selection for the Web is limited. Only a select few fonts are available on all platforms, and even when we say cross-platform fonts, we really mean just Macintosh and Windows; we forget all about Linux, other desktop operating systems, and portable devices. Personally, I tend to reach for Verdana, Georgia, Trebuchet MS, and Lucida Sans. Verdana is probably the best choice for body text on the web and preferable to Arial. I also like to replace Times with Georgia, as Georgia looks cleaner and more distinct to me than any of the Times fonts. For medium to large headings, Trebuchet MS and Lucida Sans are more appealing than Verdana.
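In a stylesheet, those preferences usually end up as font-family stacks with fallbacks for systems that lack a given face; the stacks below are just one way to write them, not a definitive recommendation:

```css
/* Illustrative font stacks only -- adjust to taste. */
body {
  font-family: Verdana, Geneva, Arial, Helvetica, sans-serif;
}

blockquote {
  font-family: Georgia, "Times New Roman", Times, serif;
}

h1, h2, h3 {
  font-family: "Trebuchet MS", "Lucida Sans", "Lucida Grande", sans-serif;
}
```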
To learn more about type for the web, read Joe Gillespie’s All You Wanted To Know About Web Type. This article from Web Page Design For Designers is still a must read for web page design.
Once you know which fonts to use, you need to know what CSS can do for you. Garrett Dimon’s CSS Typography covers some concepts to keep in mind when styling fonts, the most important being that white space and well-structured headings make web pages easier to read.
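A minimal CSS sketch of that idea, with generous line height for body copy and clear space around headings (the specific values are only a starting point):

```css
/* Illustrative spacing values only. */
body {
  line-height: 1.5;        /* extra leading makes paragraphs easier to scan */
}

p {
  margin: 0 0 1em;         /* white space between paragraphs */
}

h1, h2, h3 {
  margin: 1.5em 0 0.5em;   /* room above and below headings */
  line-height: 1.2;
}
```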