11 December, 2007

Free Templates Providers

This entry was originally posted on the omnistaretools.com blog. It is reposted here for reference only.

So today I came across a great site I wanted to share with all of you: FreeTemplatesProviders.com. They have a number of links up on their site for pages that provide free templates for download. I happen to like this sort of thing, as it certainly makes my day a whole lot easier when I have a good template to work from. (c:
Another site you might want to sneak a glance at for free website templates (if you’re as into this stuff as I am, of course) is FindMyTemplates.com. They have a number of good templates available for download, too.
But most of all, I really want you to check out our free templates. That’s right, we here at Omnistar post new free templates monthly on our site. It’s just another great service we provide.
Posted by Eric Herboso.
Did you enjoy this article? If so, then subscribe to my RSS feed.
There are more resources available at our On-line Webmaster Resource Center.

07 December, 2007

BREAKING: Google announces war on subdomains

This entry was originally posted on the omnistaretools.com blog. It is reposted here for reference only.


At PubCon, Matt Cutts of Google just announced that they will start limiting search results to two subdomains per domain for any one web search. This is a drastic revision that may very well affect your site, possibly very negatively. Thankfully, the announcement specifically stated that the changes would only be rolled out over the next couple of weeks, so you still have time to prepare.
What This Means
In the past, search engine results have treated subdomains as though they were different sites. If your company is QWERTY, Inc., for example, and you sell widgets and fizzsnaps, then you might use widgets.qwerty.com as your widgets site, and fizzsnaps.qwerty.com as your fizzsnaps site. In this way, you could keep using the same branding for all your sites by using the same domain, but use different sites for each product.
But now that logic has been perverted by Google’s recent announcement, because no matter what the search is, your domain can occupy at most two slots on any search engine results page. That means a search on “qwerty”, for example, would at most show the www subdomain and your most popular other subdomain. The remaining slots will go to other people’s sites.
Even worse is if you do not own your own domain and are just renting a subdomain: now, for any given search, you are almost GUARANTEED not to get into the search results AT ALL.
How To Make Sure Your Site Does Not Take a Hit
If you are currently relying on subdomains for the different departments of your site, now is a good time to start thinking about setting up a separate domain for each department. Do not abandon your old sites, especially if you already have long-term SEO investment in them, but start up new sites with unique one-page content on each that links strongly back to your old domains. Once these pages reach sufficient PageRank, they should be able to replace the positions you will have lost from Google’s drastic change.
If, however, you unfortunately are hosting your site under a mere subdomain, then this is the call for you to get your own domain. When this new policy rolls into place (Matt Cutts said it would be “in the next couple of weeks”), your site will be effectively dropped from all search engine results, period. The only way to fix this, unfortunately, is to get your own domain.
Posted by Eric Herboso.
Did you enjoy this article? If so, then subscribe to my RSS feed.
There are more resources available at our On-line Webmaster Resource Center.

26 November, 2007

Are tasers safe?

TASERs have been in the news a lot recently. A student was tasered at John Kerry's speech just a few months ago, and more recently an Oakland cop killed someone with a taser. Then there's the Vancouver guy who was tasered to death in the airport, and in the UK, one man actually burst into flames after he was shot with a taser. He died, too, of course.
Perhaps even more frightening are the Georgia police who unnecessarily tasered a man four times in 40 seconds, killing him. Seeing such unnecessary tasering just makes one's stomach churn, especially when you realize it is everywhere, even when someone is just stopped for a speeding ticket.
The issue, of course, is that because the taser is considered nonlethal, police seem more apt to use it in a given situation. But with all these recent deaths, one starts to wonder whether it really is nonlethal. That's why the Defence Science and Technology Laboratory (DSTL) in the UK decided to do a study specifically to find out how harmful tasers really are.
Because death from electric shock generally seems to result from a problem with the heart, the researchers focused on the possibility of cardiac arrhythmia occurring with the use of the commercial M26 and X26 tasers. They chose guinea pig hearts to test on, due to the similarity of their electrocardiographic-wave configurations to those of human hearts.
The results were surprising for me, though I suppose that's only because I was unfamiliar with similar findings from the past. The current densities of both devices had to be increased by at least a factor of 60 before erratic heartbeats were seen. In other words, tasers are safe when used properly.
There is apparently a wide safety margin between the intensity of a taser strike and the level at which a human heart would beat irregularly. The research team did, however, warn that "consumption of alcohol or some drugs, or an existing heart condition" might "reduce this safety margin in some individuals".
"I have many concerns about TASERs, but the induction of a cardiac arrhythmia appears to be less of a problem," stated Brad Roth, associate professor in the department of physics at Oakland University in Rochester, Michigan. So it remains to be seen why, exactly, so many taser deaths have occurred recently.
Perhaps it is just in misusing them that the problems occur? But even if someone is tased four times in forty seconds, how does that square with the finding that taser currents are sixty times too weak to cause irregular heartbeats? These questions are far from clear, and I certainly don't have any answers.
What do you guys think?
Update: PhysicsWorld has an article on today's study, and I was also informed that the UN now considers taser use to be a form of torture.

15 November, 2007

19 Planes Virtually Destroyed In US Skies

The administration likes to say that they're doing a great job with homeland security, since no significant attacks have been made on US soil since 9/11. But today that story has been proven to be nothing but horseshit.
The GAO released a report today detailing how teams at 19 different airports were able to smuggle bomb parts and liquid explosives right past security and onto the planes. These bombs were built out of $150 worth of materials commonly available at hardware stores, and a video they showed in session with Congress today demonstrated just how powerful such an explosion would be. (I will upload the C-SPAN footage onto YouTube shortly and post it here.) I think it was clear to all who watched that the resulting explosion could easily destroy any plane it went off in, although TSA chief Kip Hawley just had to point out that since such an explosion has not been tested inside a plane, "no one really knows" what would happen if it went off in mid-flight. (I guess they'll have to check Mythbusters for that one.)
While it is clearly troubling that in 19 attempts, 19 successes were had, perhaps it will be even more troubling to realize that when they saw how easy it was to get through, they tried AGAIN, but this time triggered the special screening to see what would happen. They STILL got through in 19 out of 19 tries.
But believe it or not, this is not the worst part.
The worst part is that last year, the GAO did the same exact thing and got the same exact results. There has been no improvement in the intervening year, even though last time, Congress told the TSA to train their people to protect against this threat. TSA chief Hawley insists that "as of right now, our airports are safe. We have fixed the problem", yet immediately afterward, the GAO official (sorry, but I didn't catch his name) retorted that "we used the same exact methods we did last year, and had absolutely no problems getting through", though he admitted he also used new techniques in some cases and additionally smuggled in liquid explosives, which they didn't do in last year's exercise.
What this means, in a nutshell, is that we apparently can't guard against this stuff. This problem was brought to the attention of Homeland Security and its TSA division last year, and they were told to at least fix this breach, yet a year later, no progress has been made. It seems obvious, therefore, that no progress can be made, since I refuse to believe that these people actually want security to be so easily broken.
The simple fact, then, is that anytime terrorists want to, they can blow up passenger planes in American airspace. With all that this gov't has tried to do to limit such threats, nothing has worked. The fact that we haven't had a major terrorist attack recently is, I then conclude, not because of hotshots behind the scenes who are taking out the bad guys Bond-style, but rather because the would-be terrorists simply haven't yet had the willpower to see such plans through. If they did, then numerous planes would already have been destroyed.
Or do you think I'm being too harsh? What do you guys think?
Update: Summary of the GAO report is available at their website. Also, apparently CNN did a short piece on this as well.

09 November, 2007

What is Ajax?

This entry was originally posted on the omnistaretools.com blog. It is reposted here for reference only.

AJAX is one of those things that every webmaster needs to know about. In today’s e-commerce world, every move toward making a site easier to navigate and interact with is a big plus, and using AJAX is one method of doing just that.
AJAX stands for Asynchronous JavaScript and XML–in plain English, that just means that XML and JavaScript are being used on a page to load content asynchronously with the rest of the page. In other words, a webpage with an AJAX component can fully load and remain static even while the AJAX component is still exchanging data between the user and the server.
The beauty of AJAX is that it allows a user to do things without requiring the entire page to reload. This makes it a powerful way of improving the speed and functionality of your site. But, perhaps more importantly, it makes your site seem more professional. As end-users, we are so used to seeing interactivity cause page refreshes that whenever we interact with a page and it doesn’t refresh, that by itself can make it feel professional.
And yet, strangely enough, AJAX is not some complicated new technology. It’s just a new way of processing information, and its power, though perhaps more psychological than anything else (saving page loads isn’t that big a deal, when you stop and think about it) is sufficient to make it a staple of all the new web 2.0 technologies.
Writing AJAX into your site is not difficult if you’re already familiar with JavaScript. In most cases, a good webmaster will be able to integrate AJAX into a site without too much technical trouble. (A good tutorial on writing AJAX is available here.) The hard part is not writing the code; it’s recognizing which parts of your site should use AJAX.
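As a sketch of just how simple the code can be, here is a minimal example in the style of 2007-era JavaScript. The URL /snippet.html and the element id content-box are hypothetical placeholders, not values from any particular tutorial:

```javascript
// Create a request object; older Internet Explorer versions use ActiveX,
// while other browsers provide XMLHttpRequest natively.
function createRequest() {
  if (window.XMLHttpRequest) {
    return new XMLHttpRequest();
  }
  return new ActiveXObject("Microsoft.XMLHTTP");
}

// Fetch the text at `url` asynchronously and drop it into the element
// with id `targetId`, without reloading the page.
function loadSnippet(url, targetId) {
  var xhr = createRequest();
  xhr.onreadystatechange = function () {
    // readyState 4 means the request is complete; status 200 means OK.
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById(targetId).innerHTML = xhr.responseText;
    }
  };
  xhr.open("GET", url, true); // third argument `true` = asynchronous
  xhr.send(null);
}
```

A link or button could then call loadSnippet("/snippet.html", "content-box") from an onclick handler to swap in new content without a page refresh.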
Try to find an interactive part of your site that would not lose functionality by making it just a small box on your main page. For some, this might be a link to subscribe to your newsletter; although if you are marketing a product, you should keep in mind that a ‘subscription confirmed’ page is not fluff, but is an important and viable marketing strategy. Another idea might be to integrate a sales presentation on your site if you have a product to sell, or a virtual home tour if you are a realtor. Although these functions could also be done in Flash, AJAX is preferable for reasons of standards: nearly everyone has JavaScript available. Flash also loads much more slowly, since the entire program has to be downloaded before it can even begin.
As you can see, AJAX is a powerful tool that webmasters can use to increase the speed, functionality, and professionalism of their sites. I hope these ideas will convince you to start working on implementing AJAX on your own site. Oh, and if any of you care to share, please post any examples of AJAX you’ve used on your website in the comments. (c;
Posted by Eric Herboso.
Did you enjoy this article? If so, then subscribe to my RSS feed.
There are more resources available at our On-line Webmaster Resource Center.

08 November, 2007

Clean and Simple Code for Good SEO

This entry was originally posted on the omnistaretools.com blog. It is reposted here for reference only.

Good SEO often takes a lot of effort, but some of the most important parts of SEO best practices are actually quite simple.

Clean & Simple Code

Making your site validate really isn’t that difficult if you code appropriately. And believe it or not, code that validates is good for SEO. By using semantic markup the way it was meant to be used, you help search engine spiders better understand your site, and they will then crawl it more efficiently.
But what does this mean, exactly?
First, make sure you have a good webmaster. Don’t use WYSIWYG HTML generators–they never validate. Your webmaster should not only know what validating means, but should be able to create valid HTML from scratch.
Second, using semantic markup means using header tags (h1, h2, h3, and so on) rather than text that is simply bolded and enlarged. This is fundamentally different from how you might make headers in a word document. It also means using emphasis and strong tags rather than italics and bold tags.
Third, always link to external files for JavaScript and CSS. Remember to keep your actual content as the main thing that spiders see when they crawl your page. If they have to wade through lines of JavaScript and style definitions, they may give up before they even reach your content.
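As a quick sketch of all three rules together (the filenames and wording here are made-up placeholders, not code from any real site):

```html
<head>
  <title>Widgets from QWERTY, Inc.</title>
  <!-- Style and behavior live in external files, keeping the markup
       that spiders crawl focused on actual content. -->
  <link rel="stylesheet" type="text/css" href="/styles/main.css">
  <script type="text/javascript" src="/scripts/main.js"></script>
</head>
<body>
  <!-- A real header tag, not text that is merely bolded and enlarged: -->
  <h1>Our Widgets</h1>
  <p>Widgets are <strong>in stock</strong> and <em>ship today</em>.</p>
</body>
```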
These three simple rules will help your SEO significantly without requiring too much effort on your part. Remember: clean code is always good.
Posted by Eric Herboso.
Did you enjoy this article? If so, then subscribe to my RSS feed.
There are more resources available at our On-line Webmaster Resource Center.

06 November, 2007

Using robots.txt for SEO

This entry was originally posted on the omnistaretools.com blog. It is reposted here for reference only.


Robots Exclusion Standard (also known as the robots.txt protocol) is the agreement whereby search engines will not read or index certain content on your site, even though it is freely available for the public at large to view. The way it works is that a robots.txt file instructs search engine spiders on which pages you don’t want them to read, and assuming a search engine is acting in good faith, it won’t crawl those pages. Obviously, this is not a reliable way of hiding data; you must have the cooperation of the search engine for it to work, and even pages that aren’t indexed are still available for viewing by anyone with a web browser. Yet it has its uses.

Blank robots.txt

Before I go into using a robots.txt file, I should mention that to maximize your SEO potential, you should consider allowing everything on your site to be read and indexed. By being fully open, you give yourself more chances to do well in search engine queries. Technically, you can accomplish this by not putting up a robots.txt file at all, but putting up a blank robots.txt file accomplishes the same thing and has the added benefit of not creating any errors in your logs. Remember that search engines will look for robots.txt whenever they crawl your site; if it is not there, this is logged as a 404 error. If you include a blank robots.txt file, these 404 errors will disappear, and you can still rest assured that your entire site will be crawled.
The file must be placed in the root directory of each subdomain, with the following text:

User-agent: *
Disallow:
The asterisk tells the spider that the following rule applies to all user-agents (i.e. search engine spiders), and the empty Disallow line means nothing is disallowed.

What should you disallow?

If you have copyrighted images that you made yourself, and you don’t want people grabbing them off Google, you might want to disallow your /images/ folder. If you use CGI, disallowing /cgi-bin/ might be useful, as CGI pages don’t tend to do well for SEO anyway. If you have a support page, you may not want that info to show up in web searches. And if you are mirroring content on multiple pages, you definitely want to disallow crawling for all but one of those pages. All of these are good examples, and you may come up with many more.

How to Disallow Portions of Your Site

The code is actually quite simple. Just list whichever pages you want disallowed as follows:

User-agent: *
Disallow: /images/
Disallow: /cgi-bin/
Disallow: /dontcrawlme.html

Will adding robots.txt help my SEO?

An open question in SEO circles is whether or not including a robots.txt file will itself increase your ranking in the search engines. It’s possible, and some people think they have evidence for it, but the answer is not really clear, as it isn’t the kind of thing that’s easy to test rigorously. After all, it’s not difficult to create a blank robots.txt file, and you certainly can’t lose anything by including it. So even though I don’t know if it makes a difference, I recommend putting one up anyway.
Hopefully, this quick overview of robots.txt will help you to properly take care of your site. If you have any further questions, feel free to leave a comment.
Posted by Eric Herboso.
Did you enjoy this article? If so, then subscribe to my RSS feed.
There are more resources available at our On-line Webmaster Resource Center.

05 November, 2007

Put Mike Gravel on Democracy NOW!

Mike Gravel is falling behind. NBC's decision to kick him out of the debate reeks of corporate censorship by General Electric, one of the companies that profits so much whenever we go to war.
We need to stand up and do something. Mike Gravel needs exposure, and he needs it NOW. What is most astonishing to me about Gravel's lack of progress in the polls is that whenever a blind poll is put out, where people must choose the issues they care about, rather than the candidates behind those issues, Mike Gravel consistently comes out on top, followed closely by Dennis Kucinich and Al Gore. I'll repeat that for those who are just scanning through this paragraph: Mike Gravel LEADS the polls whenever issues are chosen rather than faces. Not only is Mike Gravel the best candidate for President, but the US public seems to know it, too!
Furthermore, if we just let Mike Gravel fade away after what NBC did to him, then what does that say about us as a people? Corporate media cannot be allowed to dictate who can and who can't be our next President! If we let GE throw Gravel out of the race without a vote, Ron Paul will be next! And Kucinich right after that. It's not right--the whole point of having a vote is for people to vote! We MUST allow the people to have their say, rather than to let big media continue to dictate to us who can and who can't be our next President!
There are two easy things that YOU can do to help stop corporate media from pushing us around. The first is to make sure NBC's attempt to throw out Gravel is unsuccessful.
Whether you like Mike Gravel or not is beside the point. What matters is that we DO NOT LET GE WIN. Gravel must be allowed to stay in this race until it comes to a vote; if, at that time, the people decide not to go with him, then so be it. But it must come to a vote!
To accomplish this, Mike Gravel must get the exposure he needs to force the next huge media conglomerate to allow him into the debate. So here's what I need all of you who are reading this to do right now.
Go to Democracy NOW! and request Mike Gravel as a guest on their show. Democracy NOW! doesn't have the audience base that NBC does, but the people who watch Democracy NOW! are committed to helping good causes. If we can get Mike Gravel on that show, that could mean a huge increase in volunteer effort! Please, go to Democracy NOW! now!
Second, we must actively engage the public. We need to tap into the mainstream, and get people talking about why GE would do such a thing to Mike Gravel. I have an idea for this, and it involves Stephen Colbert and Ron Paul. But I still need to work out the kinks before I propose it... I'll have a proposal up on this blog by tomorrow.
In the meantime, if you care anything at all about keeping this country free, then go to Democracy NOW! now to help get Mike Gravel on their show!
Update: By request, I am posting the blind poll that shows Gravel on top. Another recent blind poll put Al Gore on top, but in that one, Gravel, Kucinich, and all republicans were not included. Ron Paul has so far not won any blind polls, though he consistently does well in nonblind straw polls.

02 November, 2007

Should You Keep URLs Consistent?

This entry was originally posted on the omnistaretools.com blog. It is reposted here for reference only.

Today’s article is going to be a little controversial, but in a past article I said that was okay, so I’m not worried. The reason today’s article is controversial is because I’m going to talk about URLs. And despite their innocuous name, the differing uses of URLs tend to create huge disagreements among quite knowledgeable people.

URLs are the web addresses you usually see at the top of your browser–a URL is basically the pathname of a given internet document. (This article’s URL, for example, might be blog.omnistaretools.com/, omnistaretools.com/blog/, or even omnistaretools.com/blog/2007/11/02/should-you-keep-urls-consistent/, since this content is served in multiple locations.) The idea behind URLs is that you can use them to reach specific content at any time. As such, web developers are in near unanimous agreement that once you put up content at a URL, it should stay at that URL.
But there are a significant number of web designers who disagree. For reasons of simplicity, increased brand recognition, and overall looks, sometimes a designer will make the conscious decision to create a site that has content over multiple pages yet keeps the same apparent URL in the browser’s address bar throughout the entire site.
At this moment, I can guarantee that any web developers out there are groaning at that last paragraph. Yet it is important to remember that web developers and web designers are two very different breeds, even if sometimes you will see developers who also design on the side, and, very occasionally, designers who develop on the side.

The difference between Designers and Developers

Web Designers are the people who determine the look and feel of a website. Their vision is largely artistic. Web Developers are the people who take that vision and write the code that puts it into action. The difference here is subtle, but it is very important to understand.
From the developer’s point of view, the best way to do things is the way it is meant to be done. He is like the engineer who wants to make the bridge functional, maintaining that its functionality is what makes it beautiful. (See CSS Zen Garden, for example.) But for the designer, there is an overarching plan that sometimes goes beyond simple functionality. Sometimes the artist wants to add parts to the bridge that will actually decrease functionality, but in a way that brings it closer to what the designer has in mind.

The Designer’s Argument

Keeping It Simple & Clean
Simplicity is king in design. Take a design class, and one of the first lessons you will learn is to respect clean whitespace as an integral part of any project. Emptiness can often be more striking than actual content.
In that same vein, designers often do not like the idea that when they design a webpage, they can only affect content within the confines of a browser. Good designers will often use whatever tricks they can come up with in order to break this restriction of being inside the box. This includes using menubar=no, scrollbars=no, and the like for popups, which I may go into in a future article. But it also includes making the address bar look clean, by forcing it to refer to the homepage only, regardless of what page you are actually on.
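For the curious, that popup trick is essentially a one-liner. This is only a hypothetical sketch: the window name, dimensions, and exact feature list are placeholders, not a recommendation from the article:

```javascript
// Hypothetical sketch: open a chrome-less popup window with no menu bar
// or scrollbars. The name and dimensions are made-up placeholders.
function openCleanPopup(url) {
  return window.open(url, "cleanpopup",
    "menubar=no,scrollbars=no,toolbar=no,location=no,width=500,height=400");
}
```

Calling openCleanPopup("tour.html") from a link's onclick handler would open the page without the usual browser chrome.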
How to Do It
Accomplishing this is actually not that difficult. You can mask an entire site to look as though it is from a different domain using functionality that is present in most domain name services, or you can do it manually yourself. Either way, the method is the same. (Omnistar Domains, for example, can do this for you automatically if you choose the ‘mask’ option.) Just create a single frame on your index page that links to your actual content. What follows below is a simplistic example, where content.html stands in for your actual content page:

<html>
<head>
<title>Your Page Title</title>
</head>
<frameset rows="100%">
<frame src="content.html">
</frameset>
</html>

Of course, the above example code is not good for SEO. To optimize your site for search engines, you must also put up noframes content that mirrors your important content. Although most readers will not view the noframes content (some will, of course, so make absolutely sure you include it for accessibility reasons), search engines will spider this content and tag it as being what you have on your index page. You MUST do this, if you want to take full advantage of the SEO benefits.
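As a rough sketch of that noframes version (content.html and the mirrored text are hypothetical placeholders), the frame page might look like this:

```html
<html>
<head>
<title>Your Page Title</title>
</head>
<frameset rows="100%">
  <frame src="content.html">
  <noframes>
    <body>
      <!-- Mirror your important content here so that spiders and
           frame-less browsers can still read it. -->
      <h1>Your Page Title</h1>
      <p>The important content from content.html, mirrored.</p>
    </body>
  </noframes>
</frameset>
</html>
```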
What benefits are those, you may ask? Well, since all of your pages show the same web address, you can spend all of your time increasing the PageRank of your index page, as opposed to spreading links out across each of your individual pages. In terms of total ranking you may lose out, but by pooling all your rankings into this one page, you can almost guarantee that it will rank higher than it otherwise would have.
Increasing Brand Recognition
This one is easy to see: by keeping your main page URL in the address bar everywhere on your site, you are reinforcing visitors’ ability to remember your site name over time.
Bringing Multiple Source Content Together
Sometimes, if you’re building a test site with a limited budget, it pays to start a website by pulling in content from another page. By using the single frame but without putting identical content in the noframes page, you can effectively start to age a site even before you’ve gotten around to building it. For example, my personal site is still a work in progress, yet instead of putting my domain name in stasis, I mask it over to my blog. This is useful for SEO, as the age of a site is taken into account when determining ranking. Please note that I do not use noframes to mirror content, as having the same content on multiple pages can actually hurt SEO. Only mirror content if the mirrored location is NOT indexed.
Another interesting use for this masking is to hide ugly CGI addresses. In this case, you really are not losing all that much, since CGI-generated pages are generally not as permanent as they could be, especially if you end up using id= parameters or similar function calls. In these cases, you don’t even get good SEO value from the pages, since search engines tend to heavily devalue any page whose address uses id= parameters or the like.
Furthermore, such addresses are generally extremely ugly and impossible to remember manually, so there is even further justification for masking the address. Yet even as I say these words, I am confident that the developers out there are protesting to their utmost, so I’ll move on to their point of view.

The Developer’s Argument

While it is true that by making all links to your site point to the same page you are increasing the links to that one page, it is also true that you will get far fewer spontaneous links to content on your site. If you have a good page on widgets, for example, and users find that a link to it instead points at your main page, which is not the widgets page they wanted, then they may decide not to link to you at all. This can be a problem, though it is of course minimized if your site is small and your best content is on your main page, as opposed to your main page being a portal to your content.
Again, the issue is that content is not where users expect it to be. When a user bookmarks your site, they expect to be able to go back to what they were looking at by returning to that bookmark. But if the address they bookmarked is in fact a different page, then they may lose interest in your site later on, when they return to the bookmark and are confused about where it has taken them.
One of the strangest effects is that by deciding to go this way with your site, you will effectively upset any web developers who happen to visit your site. They will not consider that you made this decision on purpose, but will instead just think that you do not know what you are doing with web development. This means that for a very small subset of the population, your site will look unprofessional. If you are marketing to a technical audience, then this is a major issue. But if your audience is for people in general, or some other nontechnical group, then I wouldn’t worry about this at all, since from a design standpoint, masking is much cleaner, and so looks more professional to nontechnical people.

In conclusion…

I don’t expect to win over too many developer-oriented adherents, but then that wasn’t the point of this article. As the webmaster of your own site, it really is your site, and you can make the design decisions for your site on your own. Just make sure to take note of all the pros and cons of utilizing this masking method.
Posted by Eric Herboso.
Did you enjoy this article? If so, then subscribe to my RSS feed.
There are more resources available at our On-line Webmaster Resource Center.

01 November, 2007

The Importance of Using Alt Tags

This entry was originally posted on the omnistaretools.com blog. It is reposted here for reference only.

Using alt tags on images is an important aspect of web design, for accessibility reasons. When designing a site, you should always take alt tags into account, simply because to not do so is to alienate an entire section of your audience.


It is sometimes hard to remember that a significant number of web users will interact with your site using something other than a normal configuration. Blind users often have software that will read out alt text in lieu of showing images; mobile viewers may not be able to view large images, and will instead browse your site according to alt tags; dial-up users generally turn off images when browsing unfamiliar sites, just to speed up surfing; high-tech users may be using a Lynx browser to view your site; and e-mail clients almost always have images turned off by default. In each of these circumstances, the viewer will not see the images you put on your site, just the alt text. Ignoring alt text would be to ignore the experience of these potential prospects on your site.
Thankfully, there is an easy way of fixing this. When placing an image on your site, just always remember to put in alt text describing what the image is, or what the image is for. That’s it. That’s the fix in its entirety.
(If you want to get technical for a moment, what this means is that whenever you place a picture with an img tag, just add in an alt attribute, as so (with a placeholder filename): <img src="photo.jpg" alt="description">. Of course, in reality you should also be putting in height and width attributes, but that’s a topic for a future article.)
That said, I need to also talk about the part of alt tags that I’ve been consciously omitting until now: alt tags in SEO.

Search engine optimization through alt tags?

In the past, alt tags were used in black hat SEO by stuffing in keywords that could not be seen by normal users into the alt tags. Back then, this was one of a number of techniques used to hide keywords from users while showing them to spiders indexing your site. Instead of a response to this misuse coming from search engines directly, browsers fought back by showing this text to ordinary viewers whenever a user hovered the mouse pointer over the image. But that all changed a couple of years back, when all the major search engines simultaneously decided to revise their ranking algorithms to specifically ignore all alt text content.
I’m going to repeat that to make sure it’s fully understood: Search engines do NOT consider alt text when determining your ranking. This means that keyword-stuffing your alt text is completely pointless. It does nothing to help you, and it does nothing to hurt you, in SEO terms. If all the web were just SEO, then I’d say to forget about alt tags completely. But accessibility is also important: if your site is accessible when your competitors’ sites are not, then you’ll get every sale that they lose through their inaccessibility. And that is just as important as SEO, if not more so.
Plus, who’s to say that next year search engines won’t start taking alt tags into consideration again?
A final related tip before I sign off for the day: consider placing a description of each graphic directly beneath the picture. Research has shown that prominent pictures receive a good percentage of users’ attention when they arrive on your site, and so does any text right below the image, so long as it is clearly distinguishable from the main content of the site (in italics, or small and bold). If you try this, do not eliminate the alt text from the picture; but also try not to make the alt text a direct copy of the text below it. Use the alt text to further describe the image whenever possible.
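A minimal sketch of this caption technique might look like the following; the file name, styling, and wording are hypothetical examples, not a prescription:

```html
<!-- The alt text describes the image itself... -->
<img src="team-photo.jpg"
     alt="The support team answering calls at their desks"
     width="400" height="300">
<!-- ...while the visible caption below it carries a separate message,
     set off from the main content in small italics as suggested above. -->
<p style="font-style: italic; font-size: smaller;">
  Our support team is available around the clock.
</p>
```

Notice that the caption and the alt text say different things: one speaks to sighted visitors scanning the page, the other to visitors who never see the picture at all.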
And I really shouldn’t have to say this, but if you use an image that is not meant to be seen by the user (such as a whitespace spacer image), then do not enter an alt description for it. Nothing is more irritating to a blind websurfer than hearing every bulleted point start off with the computer reading out “bullet point” in a robotic voice. Well… almost nothing. I guess webmasters who fill a white background with white keyword-stuffed text are more irritating. But not by much, since you can generally just skip over paragraphs full of keywords, whereas if you want to hear each bullet point, you’re pretty much forced to hear every instance of “bullet point” spoken aloud.
Posted by Eric Herboso.
Did you enjoy this article? If so, then subscribe to my RSS feed.
There are more resources available at our On-line Webmaster Resource Center.

31 October, 2007

Starting to Think About Ron Paul

I've been avoiding talking about Ron Paul for a while now. But I don't think I can continue without saying a few things.

There's something about this Ron Paul guy.

And I'm not just talking about the fact that he is currently third in raising funds. I'm not just talking about the fact that he wins straw polls left and right, often by a huge margin. (81% in my home state of Alabama (2nd place was 5%), and 28% in my current state of Maryland (2nd place was 24%), to name just two examples).
I guess what I'm talking about is the fact that his message seems to resonate with so many people.

Let me say up front that I am a Mike Gravel supporter. But I want to give an honest rundown of Ron Paul, so that everyone can see what I see whenever I hear about him, which seems to happen more and more often lately.

The one thing that shines through about Ron Paul is his sincerity. He has very strong opinions, and his votes on the Hill never deviate from them. One thing he is very firm about is his absolute refusal to vote for any measure that he believes the constitution does not authorize congress to oversee, even if he is in fact in favor of whatever that measure is trying to address. The man is a constitutionalist through and through, and he has stood up against special interests and lobbyists of all kinds throughout his tenure. In fact, it is so well known that he remains true to his core beliefs that few lobbyists even bothered to try with him, at least up until his presidential campaign got up to speed.
Many times, looking at congressional voting records is difficult, as there are almost always extenuating circumstances with different bills. But not so with Ron Paul; his voting record is clean as the driven snow.
He has NEVER:
  • voted to raise taxes
  • voted for a budget that wasn't balanced
  • voted to raise congressional pay
  • voted for a bill that he believed was contrary to the constitution, including any and all bills restricting gun ownership of any kind, and any bill that would have granted the executive branch additional powers

  • In addition, he voted against the Patriot Act.
  • He voted against regulating the internet.
  • He voted against the Iraq war.
Furthermore, he refuses to accept anything more than a minimal salary; he does not participate in the congressional pension plan, and he returns a portion of his annual congressional office budget to the US treasury every year.
What's more, he is one of the most active congressmen I've ever seen. He has introduced more pieces of legislation than any other single member of congress, though to be honest, not many of them get through, due to his unwillingness to work 'behind the scenes' and participate in quid pro quo politics.

In short, Ron Paul is honest, decent, and truthfully believes in his convictions. It is rare to see such a person running for President. It is even rarer to see them actually starting to do really well, as Dr. Paul is.

His foreign policy is very similar to Mike Gravel's. He wants to pull our troops out, not just from Iraq, but from most everywhere else, too. To quote Mike Gravel on a point that Ron Paul agreed with: "Policing the world is just their way of enforcing our American empire."
Ron Paul's ideas on free trade are quite confusing. He is libertarian, yet... Ron Paul believes that free trade deals and world governmental trade organizations like ICC, NAFTA, GATT, WTO, and CAFTA are bad for our nation. To quote Dr. Paul: "We must withdraw from any organizations and trade deals that infringe upon the freedom and independence of the United States of America."
Ron Paul also strongly believes in privacy, even though he does not agree that the constitution has a privacy clause in it. He's against a national ID card, against the use of a social security number in the private sector (he wants it to be gov't only), against letting medical insurance companies see your medical info, against the current US law that notifies the US gov't whenever you deposit $10k or more into a bank, etc.
He is also strongly against birthright citizenship. "As long as illegal immigrants know their children born here will be citizens, the incentive to enter the U.S. illegally will remain strong." And Dr. Paul believes that removing incentives is the only way to properly enforce laws.
He is for what he calls 'health freedom', which he describes as your ability to take care of your health the way you best see fit. This includes a move against the FDA, possibly to the point of shutting it down, and opposition to any bill that would require US citizens to be immunized against any disease, such as the recent HR 5005, which authorized the forced vaccination of American citizens against smallpox.
He is very strongly pro-life, as he was a practicing medical doctor for many years, and delivered many babies himself.
He is even more stringently against taxation, wanting to minimize taxes to the extreme. He wants to abolish the IRS and replace it with NOTHING. He points out that losing the income tax would bring federal revenue to about where it was in 2000, and he thinks that cutting spending back to that level shouldn't be that difficult. He also wants to start backing every dollar with something physical, so that the US gov't has to stop taking out loans by printing more money. He is an extreme fiscal conservative.
He wants to eliminate social security for all younger people, and yet still pay out top dollar to those depending on it, as well as actually increasing what they receive, because he believes that all taxes on social security income should be immediately repealed. This would be paid for by reducing the military.

I could go on and on. But you get the idea. His issues are pretty straightforward, and he believes in them all fairly strongly.

There are a number of things that I disagree with. But the weird part is... If he were president, most of this stuff he'd never be able to implement. And the stuff that he would be able to implement is stuff I basically agree with.
So, in a weird way, I almost wouldn't mind if he were president.

Please don't get me wrong; I support Mike Gravel. And I think removing the FDA is just as ludicrous as his consistent record of voting to allow citizens to legally own submachine guns. But I have to hand it to him: at least he's consistent. And when you think about it, only congress has the authority to deal with gun legislation. Only congress has the authority to deal with birthright citizenship. Only congress has the authority to back the dollar. And since Ron Paul is so sincere, I cannot imagine that he would use the executive branch to subvert his own ideals in order to bring those policies into place. No, if he were president, these insane ideas that I wholeheartedly think are stupid would never come into play at all.
On the other hand, everything that I do agree with is stuff that he would then have authority to start realizing: foreign policy, trade agreements, int'l organizations, etc. The only thing, in fact, that he would have power over that I don't immediately like is the possibility of his appointing new supreme court justices. But Ron Paul seems so honest... I can't imagine him appointing anyone that wasn't a strict conservative constitutionalist. And those types of people would never repeal earlier decisions by their own court without overriding need, and so they would not vote to overturn Roe v Wade, for example.

So with Mike Gravel getting beaten back left and right, and with Dennis Kucinich saying incredibly stupid things like "I saw a UFO" on live national television, I'm really starting to think hard about Ron Paul.
And I'm not sure what I'm going to do yet.

30 October, 2007

Kucinich: "I saw a UFO."

Kucinich: "I saw a UFO."
.:sigh:. Look: I'm for Mike Gravel, and I back him 95%. But if he doesn't make it, the only choice I have left is Dennis Kucinich. So it really irks me that he's so idiotic as to say something like this during the democratic presidential debate.
Why can't truly intelligent people ever make it in US politics? Why is it that of all the contenders that are left (if Gravel's run is over), the last person I feel capable of voting for is someone who claims to have seen a UFO?
I really want to move to another country. )c:

Watch Gravel at NBC Presidential Debate

Mike Gravel, who was banned from tonight's democratic presidential debate by NBC, is going to answer all the debate questions from across the street, broadcast live on the web at Gravel2008.us.
If you plan to watch tonight's debate, then you should also watch Gravel's webcast! If we let GE and its subsidiary, NBC, dictate who is and who is not allowed to become president, then it is not Mike Gravel that loses out, but all of democracy itself! How can we claim to have a free society when this kind of thing occurs, and no one acts to stop it? Why are we not revolting in the streets right now, as they would be doing in any respectable developing country? We cannot--we dare not allow the giant news corporations to dictate to us who is allowed airtime on these presidential debates! If we want to call ourselves free, then we must push for freedom!
Watch the debate live tonight, at Gravel2008.us! (6pm pst/9pm est)
Update: Now that the debate is over, here's the video of the event. Be aware that the audio was messed up in the first two minutes, but it quickly clears up.

What is Digg?

This entry was originally posted on the omnistaretools.com blog. It is reposted here for reference only.

Web 2.0 is all about socialization. Whereas traditional media ruled in the 1.0 era, now that 2.0 is here, social media gets all the attention. Even the old guard has brought web 2.0 to their sites: major newspapers like the New York Times and The Guardian have blog comments and forums where users can give feedback, and major television news corporations like Fox News actively request and air user-generated content, such as video of the recent California fires, or quick comments sent off to The O’Reilly Factor.
But the newest Web 2.0 trend is social bookmarking. Whereas in the old web 1.0 days, setting a bookmark meant using your browser to list a site as your favorite, and then finding that link again meant you had to go back to that same browser on that same computer, now setting a bookmark on your social bookmarking site of choice creates an online link that is not only for your reference, but is open to the public at large to see what you found that was interesting enough to bookmark.
Web 2.0 bookmarks are shared among thousands of viewers, and something as simple as setting a bookmark can now mean that that site will receive thousands of hits within the space of a few days.

What is Digg?

The most popular social bookmarking site by far is Digg. Ostensibly a bookmarking site concentrated on technology news, in reality Digg is the bookmarking site of choice for the masses. If it’s popular on the web, then you can bet it’s either already on Digg or else it was made popular well before Digg‘s arrival on the scene in 2004.
Digg works like this: someone finds a website or page they find interesting, and they submit it on Digg. Then others browsing Digg see the submission, and, if they also think it is interesting, they bookmark it as well. On Digg, the act of bookmarking is called ‘digging’, and when a site has a number of people who have bookmarked it, then one says that it has been ‘dugg’ that many times.
This is important for you as a webmaster, because when you put up useful content, it is always good to find some way of getting that content seen by many viewers. By making it easy for content on your site to be dugg, you will start to get a lot of new visitors that you otherwise would not have had. It should be mentioned that, by far, the majority of these visitors will not be high-quality leads, but the sheer number of visits you may receive from a popular article on Digg will certainly drive a number of conversions.
When putting up new content, try placing a button next to your article that allows readers to submit to digg.com in a single click. (Some example buttons for this use are available at Digg.com.) If you want, you might try submitting your own articles to get started, but make sure that you only do this for articles that you feel are high-quality enough to not be interpreted as spam. Make sure that when you submit your article, you put in a good description and place it in the correct category; once submitted, these options cannot be modified.
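As a rough illustration, such a button can be as simple as a link to Digg’s submission page with your article’s address filled in. The exact URL parameters below are an assumption on my part, and the article address is a placeholder; check the button instructions on Digg.com itself for the current format before copying this.

```html
<!-- Hypothetical one-click submit link; verify the parameter names
     against Digg's own button instructions before using. -->
<a href="http://digg.com/submit?url=http://www.example.com/my-article&amp;title=My+Article+Title">
  Digg this article!
</a>
```

The ready-made buttons offered on Digg.com do essentially the same thing, with the added benefit of showing the current digg count next to the link.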

A few caveats…

Having said all of this, I want to make sure that everyone understands that the real reason Digg is useful is SEO. It is in getting a pagerank 7 or 8 link to your content that Digg really shines. Yes, a popular story may get you more visits in one day than your site usually gets in six months. But those visits are mostly from people browsing the general archive, and they are not high-quality visitors; their conversion rate will be far less than what you normally receive. Nevertheless, the high pagerank link makes it all worth it.
A final caveat: remember that each webpage has its own pagerank–your site as a whole does not share a single pagerank, though extensive internal linking always helps. Thus when you get a story dugg, remember that the url of the dugg page is the only page on your site to gain that high pagerank link. Nevertheless, do not try digging the main page of your site until you feel truly comfortable with the system, as editing digg links once posted is impossible, and the system does not allow multiple links to the same url.
Hopefully, this information has gotten you up to speed on at least one web 2.0 site that can help your SEO. In future articles, I will be covering many more.

Digg this story!
Posted by Eric Herboso.
Did you enjoy this article? If so, then subscribe to my RSS feed.
There are more resources available at our On-line Webmaster Resource Center.

26 October, 2007

NBC is Offered $1,000,000 if They Let Gravel Speak

Recently, NBC decided to screw Mike Gravel by setting up arbitrary requirements specifically designed to bar Gravel from participating in the upcoming 30 October democratic presidential debate.
Well, now one of Gravel's supporters (who is apparently also a Ron Paul supporter) has decided to stand up and put his money where his mouth is. Gregory Chase, multi-millionaire, sent this letter to five executives at NBC, DNC chairman Howard Dean, the President of Drexel University, and also published it as an advertisement in four newspapers. Here is an excerpt:
If it would help get Senator Gravel back into the debate, I offer to purchase $1 million of advertising from NBC, or simply pay NBC $1 million in exchange for the service of allowing Senator Gravel to participate in your debate.
But he doesn't stop there. He also posted this youtube video, where he personally offers $25,000 to whoever produces the Gravel youtube video with the most views through the end of the year.
Furthermore, Mr. Chase has personally purchased advertising space for every day from now until the end of the year in the three major New Hampshire newspapers: the Monitor, the Manchester Union Leader and the Nashua Telegraph.
Stay tuned for additional updates as this story moves forward.
Update: Jon Kraus interviewed Gregory Chase, and Bryan Bissell reported it on his blog.
Second Update: From Jon Kraus's interview:
Chase said that he had spoken with an election lawyer who advised him to make independent expenditures on behalf of Gravel, meaning his name would have to be attached to any advertisements he placed, along with a notation that they were not affiliated with the campaign. Furthermore, Chase mentioned that he has never met Gravel, and for that matter has never been to a political rally in New Hampshire.
Kraus also mentioned in the interview that Gregory Chase is a Ron Paul supporter as well. I amended the copy above to reflect this.