Thursday, April 20, 2006

-- FAQ --

What is BlogLaughs?
The sole purpose of BlogLaughs is to help readers discover some of the most humorous blogs on the Internet.

BlogLaughs is maintained by a magazine editor who compiles the ratings and comments of more than 70 bloggers who volunteer to give their anonymous opinions about a different blog every week.

Can I be one of your reviewers?
Yes. We are always looking for reviewers. Just send me an e-mail and I'll put you on the list.

What is required of BlogLaughs reviewers?
Nothing. I'll send one e-mail each week with a link to the blog we're reviewing. You can choose to respond to the e-mail or not. No pressure. No problems.

If you choose to respond to the weekly e-mail, all you have to do is rate each blog on a scale of 1-10 in a few different categories. An explanation of your rating in a specific category is appreciated by everyone, but not mandatory.

We will gladly give any reviewer a reciprocal link on our blogroll if requested. However, most of our reviewers want to be completely anonymous.

Can you review my blog?
Possibly. However, there are a few things to consider.

Since we only review one blog each week, it may take some time to get around to it.

I'd also like to warn you that some of our reviewers are brutally honest. If they don't like something about your blog, they will tell you. Please remember to take these reviews with a big grain of salt.

Can you remove my blog from the review list?
Absolutely. The last thing we want to do is cause anyone any kind of stress. This is all in fun. If your blog is scheduled for an upcoming review, and you do not want your blog reviewed, just let us know and we'll remove it from the list.

Can you explain the scoring process?
Humor content is 70 percent of the score. Design, quality of writing/grammar, intangibles, and frequency are each worth 5 percent. The answer to the question, "Would you read this blog regularly?" is worth 10 percent.

An additional 10 points is also added to every score to help make the numbers line up with a standard grading scale (A, A-, B+, B, etc...). This was done to make the cumulative score more palatable to the first-time visitor.
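Put together, the weights above could be sketched like this, assuming each category is rated out of 10 and the weights apply directly. The category names and function are ours for illustration, not the site's actual code:

```python
# Illustrative sketch of the FAQ's scoring formula (assumed, not the
# site's real implementation): each category is rated 1-10, weighted
# per the FAQ, scaled to 100, and a flat 10-point curve is added.

WEIGHTS = {
    "humor": 0.70,
    "design": 0.05,
    "writing": 0.05,
    "intangibles": 0.05,
    "frequency": 0.05,
    "would_read": 0.10,
}
CURVE = 10  # flat bonus that lines scores up with an A/A-/B+ grading scale

def blog_score(ratings):
    """Weighted average of 1-10 category ratings, scaled to 100, plus the curve."""
    weighted = sum(WEIGHTS[cat] * ratings[cat] for cat in WEIGHTS)
    return weighted * 10 + CURVE
```

With every category rated an 8, for example, the formula lands on 90, which reads as a low A on a standard grading scale.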

Some of the blogs at the bottom of your Top-50 list are better than some of the ones at the top. Why is that?
Our scoring criteria are uniform, but the results are not consistent, because we don't have the same reviewers for each review. Only a handful of people take part every week. Everyone else comes and goes.

Individual opinions are very subjective. If a handful of people really love or hate a particular blog, it can alter the numbers. If those same people don't respond every week -- and expecting them to would be unreasonable -- comparisons from one review to the next are not consistent.

Because of this, we always say our scoring system has a 20-point margin of error. That's far too great for the numbers to be taken seriously. We believe the discussion is more important than the rating.

You people are idiots!
That's not really a question, but thanks for visiting.

Look, we are the first to admit BlogLaughs is far from perfect. However, I will put our scoring method up against any other rating system out there.

Rating blogs is very subjective. Please take our reviews with a grain of salt and you'll be OK.

What's so wrong with having ads on my site?
In general, nothing. However, some blog readers have particular pet peeves. Some don't like ads. Some don't like long posts, or extra clicks, or long sidebars, or any number of things.

We round up all of these pet peeves in the "intangibles" category so it can be noted but not become an overwhelming factor in the scoring process.

Why humorous blogs only?
Mainly because we like humor blogs. Many of the sites we have reviewed may also fit into other categories -- entertainment, gossip, political, personal, etc... -- but focusing on the humor content is difficult enough.

How can you rate blogs when your blog sucks?
Let's do this by the numbers ...

We're not a humor blog, but I'd say our content is at least a 6 or 7.

Our design is currently a Blogger template. Most of our reviewers give those a 5 or 6. (We've talked to different designers about a redesign over the past few months, but we procrastinate.)

Our quality of writing is based on the comments of our reviewers. Some are better writers than others. Each review is edited by a professional but he's far from perfect. Mistakes happen. Let's say 7.

The only intangible most of our reviewers mention regarding other sites is length of posts. Some of our posts end up being longer than others. Let's say 9.

Frequency is easy math. Since we're once a week, that's a 2.

"Would you read this blog regularly?" is a difficult question to answer. Our readers are very loyal, but I understand this isn't everyone's bag. Let's say 50 percent, or 5 points.

Add it all up and that gives us a score of somewhere between 68.5 and 76.

Of course, that's a subjective score by one person. Your opinion may differ. Feel free to tell us what you think.

What is the most controversial blog you have reviewed?
Hands down, raymitheminx.

Our review of the award-winning Canadian blog was very harsh. Some admitted they might be too old to understand, but most just ripped it because it wasn't the kind of blog they enjoy.

Raymi and her young readers fought back, calling our reviewers "fatties" who probably had "sandwiches thrown at them in high school."

Most of our readers felt those comments were funnier than anything they'd seen at Raymi's blog, but the heated debates raged on for about a week in our comments section.

I admittedly published quotes from our reviewers that went too far. We hope to do a better job in the future, and we hope we have learned something from this review.

The entire original post and comments have been left intact to remind all of us what can happen when things go too far.

Anything else?
When I came up with the scoring system, I thought the final question, "Would you read this blog regularly?" would be one of the most important.

But as I watched this particular number go up and down with no correlation to the ratings in the other categories, I began to worry.

However, as time goes on, I've come to see this question as a snapshot of the actual sample of people taking part in each week's review:

A relatively small group of readers -- 45 percent of whom would read this blog regularly -- think this particular blog is good, bad, or somewhere in between.

It makes sense that a group with few potential future readers would score a blog low. It doesn't mean that every possible reader of that blog feels that way. It just means this particular sampling of readers likes or dislikes the blog. Nothing more. Nothing less.

Updated Feb. 22, 2007

