26 November, 2007

Are tasers safe?

TASERs have been in the news a lot recently. A student was tasered at John Kerry's speech just a few months ago, and more recently an Oakland cop killed someone with a taser. Then there's the Vancouver guy who was tasered to death in the airport, and in the UK, one man actually burst into flames after he was shot with a taser. He died, too, of course.
Perhaps even more frightening are the Georgia police who unnecessarily tasered a man four times in 40 seconds, killing him. Seeing such unnecessary tasering just makes one's stomach churn, especially when you realize it happens everywhere, even when someone is just stopped for a speeding ticket.
The issue, of course, is that because the taser is considered nonlethal, police seem more apt to reach for it in any given situation. But with all these recent deaths, one starts to wonder whether it really is nonlethal. That's why the Defence Science and Technology Laboratory (DSTL) in the UK decided to do a study specifically to find out how harmful tasers really are.
Because the cause of death from electric shock generally seems to be a problem with the heart, the researchers focused on the possibility of cardiac arrhythmia occurring with the use of the M26 and X26 commercial tasers. They chose guinea pig hearts to test on, due to the similarity of their electrocardiographic-wave configurations to those of human hearts.
The results were surprising to me, though I suppose that's only because I was unfamiliar with similar findings from the past. The current densities of both devices had to be increased by at least a factor of 60 before erratic heartbeats were seen. In other words, tasers are safe when used properly.
There is apparently a wide safety margin between the intensity of a taser strike and the level at which a human heart would beat irregularly. The research team did, however, warn that "consumption of alcohol or some drugs, or an existing heart condition" might "reduce this safety margin in some individuals".
"I have many concerns about TASERs, but the induction of a cardiac arrhythmia appears to be less of a problem," stated Brad Roth, associate professor in the department of physics at Oakland University in Rochester, Michigan. So it remains to be seen why, exactly, so many taser deaths have occurred recently.
Perhaps it is just in misusing them that the problems occur? But even if someone is tased four times in forty seconds, how does that square with the finding that a taser's current would have to be sixty times stronger to cause irregular heartbeats? These questions are far from settled, and I certainly don't have any answers.
What do you guys think?
Update: PhysicsWorld has an article on today's study, and I was also informed that the UN now considers the use of tasers to be a form of torture.

15 November, 2007

19 Planes Virtually Destroyed In US Skies

The administration likes to say that they're doing a great job with homeland security, since no significant attacks have been made on US soil since 9/11. But today that story was shown to be nothing but horseshit.
The GAO released a report today that detailed how teams at 19 different airports were able to smuggle bomb parts and liquid explosives right past security and onto the planes. These bombs were built out of $150 worth of materials commonly available at hardware stores, and a video shown to Congress today demonstrated just how powerful such an explosion would be. (I will upload the C-SPAN footage to YouTube shortly and post it here.) I think it was clear to all who watched that the resulting explosion could easily destroy any plane it went off in, although TSA chief Kip Hawley just had to point out that since such an explosion has not been tested inside a plane, "no one really knows" what would happen if it went off in mid-flight. (I guess they'll have to check Mythbusters for that one.)
While it is clearly troubling that all 19 attempts succeeded, it is perhaps even more troubling that when the testers saw how easy it was to get through, they tried AGAIN, this time deliberately triggering the special screening to see what would happen. They STILL got through in 19 out of 19 tries.
But believe it or not, this is not the worst part.
The worst part is that last year, the GAO did the same exact thing and got the same exact results. There has been no improvement in the intervening year, even though last time, Congress told the TSA to train their people to protect against this threat. TSA chief Hawley insists that "as of right now, our airports are safe. We have fixed the problem", yet immediately afterward, the GAO official (sorry, but I didn't catch his name) retorted that "we used the same exact methods we did last year, and had absolutely no problems getting through", though he admitted he also used new techniques in some cases and additionally smuggled in liquid explosives, which they didn't do in last year's exercise.
What this means, in a nutshell, is that we apparently can't guard against this stuff. This problem was brought to the attention of Homeland Security and its TSA division last year, and they were told to at least fix this breach, yet a year later, no progress has been made. The obvious conclusion, then, is that no progress can be made, since I refuse to believe that these people actually want security to be so easily broken.
The simple fact, then, is that anytime terrorists want to, they can blow up passenger planes in American airspace. For all that this government has tried to do to limit such threats, nothing has worked. The fact that we haven't had a major terrorist attack recently, I conclude, is not because of hotshots behind the scenes who are taking out the bad guys Bond-style, but rather because the would-be terrorists simply haven't yet had the willpower to see such plans through. If they did, then numerous planes would already have been destroyed.
Or do you think I'm being too harsh? What do you guys think?
Update: Summary of the GAO report is available at their website. Also, apparently CNN did a short piece on this as well.

09 November, 2007

What is Ajax?

This entry was originally posted on the omnistaretools.com blog. It is reposted here for reference only.


AJAX is one of those things that every webmaster needs to know about. In today’s e-commerce world, every move toward making a site easier to navigate and interact with is a big plus, and using AJAX is one method of doing just that.
AJAX stands for Asynchronous JavaScript and XML. In plain English, that just means that XML and JavaScript are being used on a page to load content asynchronously with the rest of the page. In other words, a webpage that has an AJAX component can fully load and remain static even while the AJAX component is still exchanging data between the user and the server.
The beauty of AJAX is that it allows a user to do things without requiring the entire page to reload. This makes it a powerful way of improving the speed and functionality of your site. But, perhaps more importantly, it makes your site seem more professional. As end-users, we are so used to seeing interactivity cause page refreshes that whenever we interact with a page and it doesn’t refresh, that by itself can make it feel professional.
And yet, strangely enough, AJAX is not some complicated new technology. It’s just a new way of processing information, and its power, though perhaps more psychological than anything else (saving page loads isn’t that big a deal, when you stop and think about it), is sufficient to make it a staple of all the new web 2.0 technologies.
Writing AJAX into your site is not difficult if you’re already familiar with JavaScript. In most cases, a good webmaster will be able to integrate AJAX into a site without too much technical trouble. (A good tutorial on writing AJAX is available here.) The hard part is not writing the code; it’s recognizing which parts of your site should integrate AJAX.
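If you want a rough idea of what this looks like in practice, here is a minimal sketch of an AJAX call in plain JavaScript. (The function name, the element id, and the file being fetched are all placeholders of my own, not code from the tutorial linked above.)

// Fetch a page fragment and drop it into the current page without a reload.
function loadContent(url) {
    var xhr = window.XMLHttpRequest
        ? new XMLHttpRequest()                      // most browsers
        : new ActiveXObject("Microsoft.XMLHTTP");   // older Internet Explorer
    xhr.onreadystatechange = function () {
        // readyState 4 means the server's response has fully arrived
        if (xhr.readyState == 4 && xhr.status == 200) {
            document.getElementById("content-box").innerHTML = xhr.responseText;
        }
    };
    xhr.open("GET", url, true); // true = asynchronous
    xhr.send(null);
}

// Example usage: loadContent("newsletter-signup.html");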
Try to find an interactive part of your site that would not lose functionality by making it just a small box on your main page. For some, this might be a link to subscribe to your newsletter; although if you are marketing a product, you should keep in mind that a ‘subscription confirmed’ page is not fluff, but is an important and viable marketing strategy. Another idea might be to integrate a sales presentation on your site if you have a product to sell, or a virtual home tour, if you are a realtor. Although these functions could also be done in Flash, AJAX is preferable because of standards: nearly everyone has JavaScript available. Also, Flash loads much more slowly, since the entire program has to be downloaded before it can even begin.
As you can see, AJAX is a powerful tool that webmasters can use to increase the speed, functionality, and professionalism of their site. I hope that these ideas will convince you to start working on implementing AJAX on your own site. Oh, and if any of you care to share, please post any examples of AJAX you’ve used on your website in the comments. (c;
Posted by Eric Herboso.
Did you enjoy this article? If so, then subscribe to my RSS feed.
There are more resources available at our On-line Webmaster Resource Center.

08 November, 2007

Clean and Simple Code for Good SEO

This entry was originally posted on the omnistaretools.com blog. It is reposted here for reference only.


Good SEO often takes a lot of effort, but some of the most important parts of SEO best practices are actually quite simple.

Clean & Simple Code

Making your site validate really isn’t that difficult if you code appropriately. And believe it or not, code that validates is good for SEO. If you use semantic markup the way it was meant to be used, search engine spiders will be better able to understand your site, and will then crawl it more efficiently.
But what does this mean, exactly?
First, make sure you have a good webmaster. Don’t use WYSIWYG HTML generators–they never validate. Your webmaster should not only know what validating means, but should be able to create validated HTML from scratch.
Second, using semantic markup means using header tags (<h1>, <h2>, and so on) rather than text that is simply bolded and enlarged. This is fundamentally different from how you might make headers in a word document. It also means using emphasis (<em>) and strong (<strong>) tags rather than italic (<i>) and bold (<b>) tags. (A short example covering this point and the next appears below.)
Third, always link to external files for JavaScript and CSS. Remember to keep your actual content as the main thing that spiders see when they crawl your page. If they have to wade through lines of JavaScript and style definitions, they may give up before they even reach your content.
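To make the second and third points concrete, here is a minimal sketch of what a cleanly coded page might look like. (The file names styles.css and scripts.js are just placeholders for your own external files.)

<html>
<head>
<title>Widget World</title>
<!-- style and behavior live in external files, not inline -->
<link rel="stylesheet" type="text/css" href="styles.css">
<script type="text/javascript" src="scripts.js"></script>
</head>
<body>
<!-- real header tags, not text that is merely bolded and enlarged -->
<h1>Widget World</h1>
<h2>This Week's Specials</h2>
<p>Our <strong>deluxe widget</strong> is <em>finally</em> back in stock.</p>
</body>
</html>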
These three simple rules will help your SEO significantly without requiring too much effort on your part. Remember: clean code is always good.
Posted by Eric Herboso.
Did you enjoy this article? If so, then subscribe to my RSS feed.
There are more resources available at our On-line Webmaster Resource Center.

06 November, 2007

Using robots.txt for SEO

This entry was originally posted on the omnistaretools.com blog. It is reposted here for reference only.

 

The Robots Exclusion Standard (also known as the robots.txt protocol) is the agreement whereby search engines will not read or index certain content on your site, even though it is freely available for the public at large to view. The way it works is that a robots.txt file will instruct search engine spiders on which pages you don’t want them to read, and, assuming the search engine is acting in good faith, they won’t crawl those pages. Obviously, this is not a reliable way of hiding data; you must have the cooperation of the search engine for it to work, and even pages that aren’t indexed are still available for viewing by anyone with a web browser. Yet it has its uses.

Blank robots.txt

Before I go into using a robots.txt file, I should mention that to maximize your SEO potential, you should consider allowing everything on your site to be read and indexed. By being fully open, you give yourself more chances to do well in search engine queries. Technically, you can accomplish this by not putting up a robots.txt file at all, but putting up a blank robots.txt file will accomplish the same thing and has the added benefit of not creating any errors in your logs. Remember that search engines will look for robots.txt whenever they crawl your site; if it is not there, then this is logged as a 404 error. If you include a blank robots.txt file, these 404 errors will disappear, and you can still rest assured that your entire site will be crawled.
A blank robots.txt must be placed in the root directory of each subdomain, with the following text:

User-agent: *
Disallow:
The asterisk tells the spider that the following rule should be followed by all user-agents (i.e. search engine spiders), and the blank disallow means nothing is disallowed.

What should you disallow?

If you have copyrighted images that you made yourself, and you don’t want people grabbing them off Google, you might want to disallow your /images/ folder. If you use CGI, disallowing /cgi-bin/ might be useful, as those pages don’t tend to do well for SEO anyway. If you have a support page, you may not want that info to show up in web searches. If you are mirroring content on multiple pages, you definitely want to disallow crawling for all but one of those pages. All of these are good examples, and you may come up with many more.

How to Disallow Portions of Your Site

The code is actually quite simple. Just list whichever pages you want disallowed as follows:

User-agent: *
Disallow: /images/
Disallow: /cgi-bin/
Disallow: /dontcrawlme.html

Will adding robots.txt help my SEO?

An unsolved question in SEO circles right now is whether or not including a robots.txt file will increase your ranking in the search engines. It’s possible, and some people think they have evidence for it. But the answer is not really clear, as it isn’t the kind of thing anyone has tested carefully enough to say for sure. After all, it’s not difficult to create a blank robots.txt file, and you certainly can’t lose anything by including it. So even though I don’t know if it makes a difference, I recommend putting one up anyway.
Hopefully, this quick overview of robots.txt will help you to properly take care of your site. If you have any further questions, feel free to leave a comment.
Posted by Eric Herboso.
Did you enjoy this article? If so, then subscribe to my RSS feed.
There are more resources available at our On-line Webmaster Resource Center.

05 November, 2007

Put Mike Gravel on Democracy NOW!

Mike Gravel is falling behind. NBC's decision to kick him out of the debate reeks of corporate censorship by General Electric, one of the companies that profit so much whenever we go to war.
We need to stand up and do something. Mike Gravel needs exposure, and he needs it NOW. What is most astonishing to me about Gravel's lack of progress in the polls is that whenever a blind poll is put out, where people must choose the issues they care about, rather than the candidates behind those issues, Mike Gravel consistently comes out on top, followed closely by Dennis Kucinich and Al Gore. I'll repeat that for those who are just scanning through this paragraph: Mike Gravel LEADS the polls whenever issues are chosen rather than faces. Not only is Mike Gravel the best candidate for President, but the US public seems to know it, too!
Furthermore, if we just let Mike Gravel fade away after what NBC did to him, then what does that say about us as a people? Corporate media cannot be allowed to dictate who can and who can't be our next President! If we let GE throw Gravel out of the race without a vote, Ron Paul will be next! And Kucinich right after that. It's not right--the whole point of having a vote is for people to vote! We MUST allow the people to have their say, rather than to let big media continue to dictate to us who can and who can't be our next President!
There are two easy things that YOU can do to help stop corporate media from pushing us around. The first is to make sure NBC's attempt to throw out Gravel is unsuccessful.
Whether you like Mike Gravel or not is beside the point. What matters is that we DO NOT LET GE WIN. Gravel must be allowed to stay in this race until it comes to a vote; if, at that time, the people decide not to go with him, then so be it. But it must come to a vote!
To accomplish this, Mike Gravel must get the exposure he needs to force the next huge media conglomerate to allow him into its debate. So here's what I need all of you who are reading this to do right now.
Go to Democracy NOW! and request Mike Gravel as a guest on their show. Democracy NOW! doesn't have the audience base that NBC does, but the people who watch Democracy NOW! are committed to helping good causes. If we can get Mike Gravel on that show, that could mean a huge increase in volunteer effort! Please, go to Democracy NOW! now!
Second, we must actively engage the public. We need to tap into the mainstream, and get people talking about why GE would do such a thing to Mike Gravel. I have an idea for this, and it involves Stephen Colbert and Ron Paul. But I still need to work out the kinks before I propose it... I'll have a proposal up on this blog by tomorrow.
In the meantime, if you care anything at all about keeping this country free, then go to Democracy NOW! now to help get Mike Gravel on their show!
Update: By request, I am posting the blind poll that shows Gravel on top. Another recent blind poll put Al Gore on top, but in that one, Gravel, Kucinich, and all Republicans were not included. Ron Paul has so far not won any blind polls, though he consistently does well in nonblind straw polls.

02 November, 2007

Should You Keep URLs Consistent?

This entry was originally posted on the omnistaretools.com blog. It is reposted here for reference only.


Today’s article is going to be a little controversial, but in a past article I said that was okay, so I’m not worried. The reason today’s article is controversial is because I’m going to talk about URLs. And despite their innocuous name, the differing uses of URLs tend to create huge disagreements among quite knowledgeable people.

URLs are the web addresses you usually see at the top of your browser–a URL is basically the pathname of a given internet document. (This article’s URL, for example, might be blog.omnistaretools.com/, omnistaretools.com/blog/, or even omnistaretools.com/blog/2007/11/02/should-you-keep-urls-consistent/, since this content is served in multiple locations.) The idea behind URLs is that you can use them to reach specific content at any time. As such, the majority of web developers are in near unanimous agreement that once you put up content at a URL, it should stay at that URL.
But there are a significant number of web designers who disagree. For reasons of simplicity, brand recognition, and overall looks, sometimes a designer will make the conscious decision to create a site that has content spread over multiple pages yet keeps the same apparent URL in the address bar of the browser throughout the entire site.
At this moment, I can guarantee that any web developers out there are groaning at that last paragraph. Yet it is important to remember that web developers and web designers are two very different breeds, even if sometimes you will see developers who also design on the side, and, very occasionally, designers who develop on the side.

The difference between Designers and Developers

Web Designers are the people who determine the look and feel of a website. Their vision is largely artistic. Web Developers are the people who take that vision and create the code to put it into action. The difference here is subtle, but it is very important to understand.
From the developer’s point of view, the best way to do things is the way they are meant to be done. He is like the engineer who wants to make the bridge functional, maintaining that functionality is what makes it beautiful. See CSS Zen Garden, for example. But for the designer, there is an overarching plan that sometimes goes beyond simple functionality. Sometimes the artist wants to add parts to the bridge that will actually decrease functionality, but in a way that makes it closer to what the designer has in their mind.

The Designer’s Argument

Keeping It Simple & Clean
Simplicity is king in design. Take a design class, and one of the first lessons you will learn is to respect clean whitespace as an integral part of any project. Emptiness can often be more striking than actual content.
In that same vein, designers often do not like the idea that when they design a webpage, they can only affect content within the confines of a browser. Good designers will often use whatever tricks they can come up with in order to break this restriction of being stuck inside the box. This includes using menubar=no, scrollbars=no, and the like for popups, which I may go into in a future article. But it also includes making the address bar look clean, by forcing it to refer to the homepage only, regardless of what page you are actually on.
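(For the curious, those popup tricks boil down to a single JavaScript call along these lines; the file name and window name are placeholders of my own.)

// Open a stripped-down popup with no menu bar or scrollbars.
window.open('popup-content.html', 'cleanPopup',
    'width=400,height=300,menubar=no,scrollbars=no,toolbar=no,status=no');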
How to Do It
Accomplishing this is actually not that difficult. You can mask an entire site to look as though it is from a different domain using functionality that is present in most domain name services, or you can do it manually yourself. Either way, the method is the same. (Omnistar Domains, for example, can do this for you automatically if you choose the ‘mask’ option.) Just create a single frame on your index page that links to your actual content. What follows below is a simplistic example.

<html>
<head>
<title>Your Page Title</title>
</head>
<frameset rows="100%,*" frameborder="0">
<!-- point the frame at wherever your actual content lives -->
<frame src="http://www.example.com/real-content.html">
</frameset>
</html>

SEO
Of course, the above example code is not good for SEO. To optimize your site for search engines, you must also put up noframes content that mirrors your important content. Although most readers will not view the noframes content (some will, of course, so make absolutely sure you include it for accessibility reasons), search engines will spider this content and tag it as being what you have on your index page. You MUST do this, if you want to take full advantage of the SEO benefits.
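In practice, that means adding a noframes block inside the frameset from the earlier example, along the lines of this sketch (the content shown is only a stand-in for whatever actually matters on your front page):

<noframes>
<body>
<h1>Your Page Title</h1>
<p>A copy of the important content from your real front page goes here,
so that search engines and frame-less browsers can still read it.</p>
</body>
</noframes>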
What benefits are those, you may ask? Well, since all of your pages have the same web address posted, you can spend all of your time increasing the pagerank for your index page, as opposed to spreading out the links to each of your individual pages. In terms of total ranking, you may lose out, but by putting all your rankings into this one page, you can almost guarantee that this one page will rank higher than it otherwise would have ranked.
Branding
This one is easy to see: by keeping your main page URL in the address bar on every page of your site, you are reinforcing visitors’ ability to remember your site name over time.
Bringing Multiple Source Content Together
Sometimes, if you’re building a test site with a limited budget, it pays to start a website out by bringing in content from another page. By using the single frame without putting identical content in the noframes section, you can effectively start to age a site even before you’ve gotten around to building it. For example, my personal site is still a work in progress, yet instead of putting my domain name in stasis, I mask it over to my blog. This is useful for SEO, as the age of a site is taken into account when determining ranking. Please note that I do not use noframes to mirror content, as having the same content on multiple pages can actually hurt SEO. Only mirror content if the mirrored location is NOT indexed.
CGI
Another interesting use for this masking is to hide ugly CGI addresses. In this case, you really are not losing all that much, since CGI-generated pages are generally not as permanent as they could be, especially if you end up using id=’w/e’ or similar function calls. In these cases, you don’t even get good SEO capability with the pages, since search engines tend to heavily devalue any page whose address uses id=’w/e’ or the like.
Furthermore, such addresses are generally extremely ugly and impossible to remember manually, so there is even further justification for masking the address. Yet even as I say these words, I am confident that the developers out there are protesting to their utmost, so I’ll move on to their point of view.

The Developer’s Argument

SEO
While it is true that by making all links to your site point to the same page, you are increasing the links to that one page, it is also true that you will get far fewer spontaneous links to content on your site. If you have a good page on widgets, for example, and users find out that linking to it actually produces a link to your main page, which is not the widgets page they wanted, then they may decide not to link to you at all. This can be a problem, though it is of course minimized if your site is small, and if your best content is on your main page, as opposed to your main page being a portal to your content.
Bookmarks
Again, the issue is that content is not where users expect it to be. When a user bookmarks your site, they expect to be able to go back to what they were looking at by going back to that bookmark. But if the address they bookmarked is in fact a different page, then they may lose interest in your site later on when they return to the bookmark and are confused about where it has taken them.
Developer-hate
One of the strangest effects is that by deciding to go this way with your site, you will effectively upset any web developers who happen to visit it. They will not consider that you made this decision on purpose, but will instead just assume that you do not know what you are doing with web development. This means that for a very small subset of the population, your site will look unprofessional. If you are marketing to a technical audience, then this is a major issue. But if your audience is the general public, or some other nontechnical group, then I wouldn’t worry about this at all, since from a design standpoint, masking is much cleaner, and so looks more professional to nontechnical people.

In conclusion…

I don’t expect to win over too many developer-oriented adherents, but then that wasn’t the point of this article. As the webmaster of your own site, it really is your site, and you can make the design decisions for your site on your own. Just make sure to take note of all the pros and cons of utilizing this masking method.
Posted by Eric Herboso.
Did you enjoy this article? If so, then subscribe to my RSS feed.
There are more resources available at our On-line Webmaster Resource Center.

01 November, 2007

The Importance of Using Alt Tags

This entry was originally posted on the omnistaretools.com blog. It is reposted here for reference only.


Using alt tags on images is an important aspect of web design, for accessibility reasons. When designing a site, you should always take alt tags into account, simply because to not do so is to alienate an entire section of your audience.

Accessibility

It is sometimes hard to remember that a significant number of web users will interact with your site using something other than a normal configuration. Blind users often have software that will read out alt text in lieu of showing images; mobile viewers may not be able to view large images, and will instead browse your site according to alt tags; dial-up users generally turn off images when browsing unfamiliar sites, just to speed up surfing; high-tech users may be using a text-only browser like Lynx to view your site; and e-mail clients almost always have images turned off by default. In each of these circumstances, the viewer will not see the images you put on your site, just the alt text. Ignoring alt text means ignoring the experience of all of these potential prospects on your site.
Thankfully, there is an easy way of fixing this. When placing an image on your site, just always remember to put in alt text describing what the image is, or what the image is for. That’s it. That’s the fix in its entirety.
(If you want to get technical for a moment, what this means is that whenever you place a picture up (<img src="picture.jpg">), just add in an alt description, as so: <img src="picture.jpg" alt="a description of the picture">. Of course, in reality you should also be putting in height and width attributes, but that’s a topic for a future article.)
That said, I need to also talk about the part of alt tags that I’ve been consciously omitting until now: alt tags in SEO.

Search engine optimization through alt tags?

In the past, alt tags were used in black hat SEO by stuffing them with keywords that could not be seen by normal users. Back then, this was one of a number of techniques used to hide keywords from users while showing them to the spiders indexing your site. Instead of a response to this misuse coming from search engines directly, browsers fought back by showing this text to ordinary viewers whenever a user hovered the mouse pointer over the image. But that all changed a couple of years back, when all the major search engines simultaneously decided to revise their ranking algorithms to specifically ignore all alt text content.
I’m going to repeat that to make sure it’s fully understood: search engines do NOT consider alt text when determining your ranking. This means that keyword-stuffing your alt text is completely pointless. It does nothing to help you, and it does nothing to hurt you, in SEO terms. If all the web were just SEO, then I’d say to just forget about alt tags completely. But accessibility is also important: if your site is accessible when your competitors’ sites are not, then you’ll get every sale that they lose due to their lack of accessibility. And this is just as important as SEO, if not more so.
Plus, who’s to say that next year search engines won’t start taking alt tags into consideration again?
A final related tip before I sign off for the day: consider putting a description of your graphics beneath each picture. Research has shown that not only do prominent pictures receive a good percentage of users’ attention when they arrive on your site, but so does any text right below the image, so long as it is clearly distinguishable from the main content of the site (in italics, or small and bold). If you try this, do not eliminate alt text from the picture; but also try not to make the alt text a direct copy of the text below it. Use alt text to further describe the image, whenever possible.
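Here is a rough sketch of what that might look like in practice (the file name, dimensions, and caption are placeholders of my own):

<img src="widget-photo.jpg" alt="Close-up of the deluxe widget's control panel" width="300" height="200">
<p><small><em>The deluxe widget, shown with its control panel open.</em></small></p>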
And I really shouldn’t have to say this, but if you use an image that is not meant to be seen by the user (such as a whitespace image or similar), then do not enter in an alt tag. Nothing is more irritating to a blind websurfer than when every bulleted point starts off with the computer reading out “bullet point” in a robotic voice. Well… almost nothing. I guess webmasters who put white text on a white background filled with keywords are more irritating. But not by much, since you can generally just skip over paragraphs full of keywords, but if you want to hear each bullet point, then you’re pretty much forced to hear every instance of “bullet point” spoken aloud.
Posted by Eric Herboso.
Did you enjoy this article? If so, then subscribe to my RSS feed.
There are more resources available at our On-line Webmaster Resource Center.