Saturday, November 14, 2009

Dumb Design

Want to hear dumb? My cordless phone has caller ID. It displays the number and other information about the caller when the phone rings.

But only when it's actually ringing! Between rings, the screen goes blank. So when you pick up the phone, you have to wait until the next ring to see who's calling. What? It doesn't remember someone's calling when it's not making the ringing sound?

Dumb or what?

Thursday, October 22, 2009

How do you end an email exchange?

One unexpected (by me, at least) byproduct of our technological age is the never-ending email conversation. I'm not sure if it's the desire to have the last word, or simply the feeling that leaving someone hanging is rude, but some email conversations go on for days, weeks, even months after their actual content is depleted. (Many, of course, never had any actual content to begin with.)

When you're on the phone, you're talking in real time, which requires some amount of concentration. You can end a phone conversation because talking on the phone is a physical commitment. Sure, you may have enough attention left over to cook, grimace with impatience, or drive into oncoming traffic. But for the most part, phone conversations will eventually end just from the sheer physical exhaustion of the participants.

Not so email. Because you can answer email at your leisure, and because each party assumes the other will eventually return to the computer, you can keep firing salvos of small talk to prop up an otherwise moribund conversation. "Thanks", "No problemo", "See you later", "You too", "Ok, bye", "Ciao", and so on. Of course, since most people repeat the entire previous exchange in each new message, the message size keeps growing and growing, even as the new content gets shorter and shorter.

Unbeknownst to many email users, most email messages contain a hidden message ID, and when you reply to a message, your reply contains a "References" header that lists the earlier message IDs in the conversation. These lists can just keep growing and growing as the conversation reaches maximum verbosity. Try this sometime: see if your email program lets you see the "original" message, and then look for a header at the top labeled "References:". I wonder if there's a Guinness world record for the longest reference list.
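
For the curious, the raw headers of a reply look something like this (the message IDs here are invented for illustration):

Message-ID: <msg004@example.com>
In-Reply-To: <msg003@example.com>
References: <msg001@example.com> <msg002@example.com> <msg003@example.com>

Each reply tacks on the ID of the message it's answering, so by the tenth round of "Ok, bye" the header is longer than the message.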

Friday, October 9, 2009

Bit Counting (The end, I hope)

So the original question was, "How do you write a function to count the number of bits set in an arbitrary byte?" I figured that if you're going to do this repeatedly, it might be worth just building a lookup table. The question then becomes "How do you populate a table of set bit counts for each possible byte?" This can be solved with a simple iteration:

int table[256];  /* table[i] will hold the number of bits set in i */
int exp = 1;     /* the highest power of 2 less than or equal to i */
table[0] = 0;
for (int i = 1; i < 256; i++)
{
    if (i == exp*2) exp *= 2;        /* i just crossed into the next power of 2 */
    table[i] = table[i-exp] + 1;     /* earlier entry's count, plus the new high bit */
}
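
And once you have the table, the interviewer's original question reduces to a single lookup. Something like this (the function name is mine):

int bit_count(unsigned char b)
{
    return table[b];    /* one array lookup; no shifting, no masking */
}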

Now if I had thought of that during the interview ...

Thursday, October 1, 2009

Things to hate

  1. People who insist on adding milk and sugar to their coffee while it's sitting under the coffee maker, so that my mug gets covered with the milk and sugar they've spilled.
  2. Coffee makers in general, for that matter.

Monday, September 14, 2009

Bit Counting (Continued)

So here's the pattern. You're trying to populate a table so that each entry in the table contains the number of bits set in the one-byte binary representation of that entry's index:

0 = 00000000 → 0 bits set
1 = 00000001 → 1 bit set
2 = 00000010 → 1 bit set
3 = 00000011 → 2 bits set
4 = 00000100 → 1 bit set
5 = 00000101 → 2 bits set
6 = 00000110 → 2 bits set
7 = 00000111 → 3 bits set
8 = 00001000 → 1 bit set
and so on.

So the first two are obviously 0 and 1. The next two are the same, but with the 00000010 bit set. Then the next four follow the same pattern but with the 00000100 bit set. The next eight follow the pattern of the previous eight, but with 00001000 set.

So to populate the table, you basically put 0 in the first entry, and then repeatedly copy all the previous entries, incrementing by 1.
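
In code, that copy-and-increment recipe comes out to something like this rough sketch (not yet the concise version I'm alluding to below):

int table[256];
table[0] = 0;
for (int size = 1; size < 256; size *= 2)     /* block size doubles: 1, 2, 4, ... 128 */
    for (int i = 0; i < size; i++)
        table[size + i] = table[i] + 1;       /* copy the previous block, incremented */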

Actually, there's a more concise way to write this I'll post later.

If I had thought of that more quickly, I might have gotten that job in a start-up that was later bought by a large software company. Who knows?

Is it any wonder I'm such a curmudgeon?

Friday, September 11, 2009

Bit Counting

A few years ago, I was on a job interview, and the interviewer asked me how to write a function to count the number of '1' bits in a byte. I replied that this sounded like an obvious table lookup problem. Just create the table with 256 entries, and use the byte value itself to index to the correct bit count.

That seemed to take him a little off guard, but he quickly asked how to populate the table. He was still trying to get me to give the answer he wanted to his original question: test each bit by successively shifting the byte and ANDing with 1. If the result of the AND was 1, then the low bit was on, and the bit count for that byte should be incremented.
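
In code, the approach he was fishing for presumably looks something like this (my reconstruction, anyway):

int count_bits_slowly(unsigned char b)
{
    int count = 0;
    for (int i = 0; i < 8; i++)
        count += (b >> i) & 1;    /* shift, mask, and tally each bit in turn */
    return count;
}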

That might be ok for counting bits in one or two bytes, but it's a pretty lame way to populate a table. I knew there had to be a better way. The lame approach would require 2048 tests and 1792 shifts. Obviously there's a pattern to the number of bits set in values ranging from 0 to 255, and I wanted to take advantage of that. (Yes, you could skip the shift and just AND each byte 8 times, once with each of 8 different masks. That's still 2048 ANDs and tests.)

Unfortunately, I was so distracted by this that I didn't answer the question right away, and I didn't get the job. Actually, I don't know if this was the deciding factor or not, but it bugged me. I always get my best answers on the way home from the interview, and this was a prime example of that.

Got it?

Monday, July 13, 2009

Public Service Message: No Sponsor Bias (so far!)

I've said this elsewhere, but it bears repeating.

In the wake of this New York Times article on sponsorship of bloggers and social media contributors, let me state emphatically that this blog has not taken a dime from any sponsors. (As if!)

It's not for want of trying. I've made a few half-hearted attempts to monetize this blog, to use the term of art, but so far, all efforts have been for naught. (I'm open to any suggestions.)

If the situation changes in the future, I'll be sure to let you know. But for now, though you may see ads alongside the posts here, rest assured that you're getting pure content, unadulterated by any sponsorship or other commercial interests. (Well, except for blatant self-aggrandizement on the part of the author.)

Thursday, June 25, 2009

Robot Jokes, part 2

Ok, I promise not to keep this up. But this one will kill you.

See, this robot walks into a bar. And the bartender says "Say, we don't get many robots in here." So the robot draws himself up to his full height, and he says "10011101101111101100100000110000111011101100100100000110000111101001000001110100110100011001011110011110010110000011100001110010110100111000111100101111001110110010000011110011101111111010110011111100101100101100000110111011011111110100100000110110011010011101011110010111011001111001100000111010011011111000001110011110010111001011000001100001110111011110011000001101101110111111100101100101100001".

Get it?

Wednesday, June 17, 2009

Robot Jokes

One casualty of this age of political correctness is the ethnic/minority joke. Such jokes are offensive in the extreme, and serve to reinforce outdated and destructive stereotypes.

However, it's also obvious that this type of humor fills a basic human need ... the need to deride and mock anyone different from ourselves. There have been ethnic jokes ever since one tribe encountered another and thought their hats were funny looking. Anthropologists have even observed chimpanzees imitating gibbons and laughing.

So, to fulfill this basic human need, we propose adoption of the robot joke. There's certainly much to mock about robots ... their cool, mechanical personalities, their shiny surface appearance, and their smug attitude of superiority. (Geez, I really hate that.) And by the time robots are actually aware and sensitive enough to take offense at such jokes, they will have taken over the world, and will be making human jokes.

So here goes:

The robots and the humans are fighting a war. The humans learned that a common robot name is X374K, so they started calling out "Hey, X374K!" When X374K stood up to respond, they would shoot. Blam! This went on for a while, until the robots decided to reciprocate. So they took a nanosecond to find that a common human name is Jack. The robots called out "Hey, Jack!" But the humans, wise to this trick, didn't stand up. They just yelled back, "Jack's not here. Is that you, X374K?" When the robots answered ... Blam!

So the robots destroyed all the humans.

Pretty funny, huh?

Tuesday, June 16, 2009

Wrong-headed

Just to get something straight, I think everybody is wrong. I don't say this lightly. I've given the matter considerable thought over a number of years, but I find this conclusion inescapable. This view has been confirmed for me by the collection of micro-essays in the book What Have You Changed Your Mind About? Basically, the book is a collection of very short essays by a bunch of incredibly smart people, all talking about ideas they held at one time, but later abandoned for various reasons.

What Have You Changed Your Mind About? is an excellent book. Well, no it's not. Ok, it is.

The point is that nobody really has a handle on reality. No political parties. No religious institutions. No philosophers. Nobody! We see only a tiny sliver of the electromagnetic spectrum. We hear a narrow range of sound wave frequencies. We're stuck looking at the universe from a tiny planet in one corner, where things that are too tiny or too big are all but invisible. We can use instruments to look at things that are tiny, or very far away. But these only expand our range a little bit. To draw conclusions about reality is like peering through the keyhole of a big mansion, and trying to deduce the color of the toilet paper in the master bathroom.

And the instruments have their own inherent flaws. Basically, they convert things we can't perceive into things we can. Telescopes, microscopes, amplifiers, etc. all make distant or tiny or quiet things appear closer or bigger or louder. Our mental model of the universe is based on just a few very primitive ideas we learn early on. So by magnifying or amplifying things, we make them comparable to familiar objects. The moon through a telescope looks like a ball out in space. Microorganisms are little squishy things. But the scale of these things is part of their reality. Putting them on our scale creates a distortion.

Anyway, I didn't mean to start down that road. I'll come back to that another time.

The other thing about everyone being wrong is that usually, when our beliefs are challenged, our first reaction is to cling to them more strongly, and to defend them. Belief systems are very comforting, because they allow us to ignore the vast unanswered questions and simply deal with the mundane business of getting through the day.

My point was that nobody knows how to fix health care. Nobody knows how to prevent terrorism. Nobody knows how to structure an economy that balances liberty with justice, so people are free to pursue their goals, but nobody gets treated unfairly. Nobody knows how to govern.

So if I occasionally rail against one political party or view or set of beliefs, that's just what's bugging me at the moment. I could undoubtedly find something just as ludicrous about the opposite view.

Oh, and when I say everyone is wrong, I'm including myself.


Tuesday, May 5, 2009

Saw Ad ... Must Buy

So NPR's Marketplace is reporting that Jay Leno may do live ads on his new 10:00 show next season. Rico Gagliano waxed nostalgic for the days when Johnny Carson used to plug products in the middle of the show. Apparently NBC, in desperation, is weighing having Leno do the same thing. The theory is that TiVo and DVR users will stop zapping through the commercials once they see Leno's face. Kenneth Wilbur, a professor of marketing (Isn't that an oxymoron?) at USC, points out that NBC's audience is "more tech-savvy," and has higher rates of DVR usage.

So, do they think all these tech-savvy users can't tell a plug from a monologue? Or that they'll appreciate being tricked into watching these ads? "Hey, I stopped zapping for this? Guess I'd better buy it!"

That's like "Gee, these Viagra sellers have really filled up my email box. I'd better get some."

Or "This movie was plastered over the whole side of a bus. It must be good!"

Or my personal favorite: "These Verizon ads are 6 decibels louder than everything else. I'd better subscribe. (I can hear them now!)"

How stupid do they think we are?

Tuesday, April 7, 2009

WTF?

In engineering, a common measure of reliability is the mean time between failures, or MTBF. What this means, basically, is how long the product runs without failing, on average. Obviously, the longer it goes without failing, the more reliable the product is. Software that runs continuously for weeks or months is better than software that fails every few days.

But if the software could be made to run faster, it might fail more often, because it gets to those failure points in less time. In other words, the faster software has a lower MTBF, and is therefore less reliable.
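
(To put toy numbers on it: suppose a slow memory leak crashes the program after a billion operations. At a thousand operations per second, that's a crash roughly every eleven and a half days. Optimize the code to two thousand operations per second, and it crashes about every six days. The MTBF just got cut in half.)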

And upgrading the computer hardware, by adding more memory, more disk space, etc., is apt to improve the software's performance which will, in turn, reduce the MTBF.

So upgrading your computer will make your software worse!

Thursday, April 2, 2009

No Hit, Herlock

In response to Britney Spears' hit, "If U Seek Amy," I'm recording a song called "No Essay, Chai Tea."

Friday, March 27, 2009

What Does It All Mean?

Human beings, or homo sapiens sapiens, as anthropologists jokingly like to call us, are generally thought to be more intelligent than other animals and most vegetables. One side effect of this intelligence is consciousness. This consciousness makes us think that we're some kind of superior beings, destined to build huge cities and dominate the planet. Actually, consciousness is basically a chemical process, like photosynthesis or Alka-Seltzer. The only thing remarkable about consciousness is that it thinks it's remarkable. In fact, consciousness is the amazingly unique ability that humans have to think that they're amazingly unique. Of course, this means that at the time of this writing, there are about 6 billion of us all thinking we're totally unique.

Another side effect of our intelligence is that we believe in something called reality, and we have a model of reality in our heads that we use for deciding not to walk in front of buses and things like that. Most of us think of reality as lots of hard physical objects scattered around in space. This is because as children, we bumped into many of them. However, that model is based just on the information we get from our five senses. We've built scientific instruments that can extend the range of our senses. We can see far into space and record microscopic behavior and measure invisible radiation. But even with all those instruments, we can perceive only a tiny fraction of all that's happening out there in the universe. So for us to come to any conclusions about what reality is like is as absurd as peeking through the keyhole in the front door of a huge mansion, and trying to deduce the color of the toilet paper in the master bathroom.

This doesn't mean that speculation and scientific investigation are bad. Intellectual curiosity has brought about some of humankind's most important achievements, like nuclear weapons and lava lamps. Of course we should speculate and investigate and philosophize and come up with theories about how things work and why things are.

We just shouldn't be too smug about it.

Tuesday, March 10, 2009

Quickie on Safari 4 beta

Ok, Safari 4 looks pretty good speed-wise, and the new tab cover flip behavior is sexy looking, if not altogether helpful. But there's some downright bad news too.
  1. The top tabs are potentially dangerous. Drag a tab in the wrong place, and you move the whole window. To re-order the tabs, you have to grab the active tab by its little "tread" triangle in the upper right corner of the tab. Annoying. I realize Safari is trying to be a Chrome clone here, but it doesn't work.
  2. The tab-ordering tread and the little close-this-tab-X-thingie don't appear until you actually move the pointer into the tab. That means you can't simply go grab the tab. You have to go to the tab, stop and look, and then do whatever you're going to do. I've only had the beta for a week or so, and already I've closed tabs inadvertently more times than I can remember.
  3. Can we please put the damn bookmarks in a bar on the left side, like every other frigging browser in the world? PLEASE? Even with the tabs above the URL box, and the bookmarks below it, it's still far too easy to hit a bookmark when reaching for a tab. And they're visually confusing.
C'mon folks! This is UI Design 101 stuff. Besides, I've got all these full-wide-screen windows, and even GMail has a limit to how many columns of junk I can display side-by-side.

I'm not alone in this. Yan Pritzger agrees with me.

Whew. Ok, I'm done now.

Tuesday, March 3, 2009

Taxonometrics?

The main problem, and the main benefit, of the Internet is that there are no editors. For little or no money, anyone can create a Web site, or start a blog, or even just join a social networking group, and begin spewing out whatever words, images, animations, videos, or other media he or she wants to. Even producing medium to high quality audio, video and animation is within the budgets of most hobbyists. So producing information is not the problem.

The challenge lies in consuming. Specifically, in selecting what to consume. Web surfers are confronted by vast landscapes of everything ranging from strident political rants to pictures of lolcats, and everything in between. Or as someone (?) once said, "Everything you might want to know is on the Internet, but not in alphabetical order."

So how do we make sense of this? The current term of art is ranking. Some Web intermediaries, such as Google, have closely held secret ranking schemes and algorithms which are, in essence, their primary added value. The basis, as with other popularity measures, is to rank according to recommendations or, in Google's case, links from other sites. More exotic schemes, such as the genome concept of Jinni (also discussed here), attempt to automate the process of characterizing both the content and the consumer's preferences.

All of these are efforts to bring order, or taxonomy, to the otherwise vast and chaotic deluge of content. This is THE NEXT BIG THING. Google's incredible success is almost entirely due to their partial solution, and other breakthroughs will change the landscape for everyone. So it seems logical we should be able to compare organizational, or taxonomic, approaches to information content. I propose the term taxonometry to describe the measurement of taxonomies. It fits semantically, and I like the way it sounds. Remember ... you heard it here first.
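
For the mechanically curious, here's roughly what link-based ranking looks like under the hood. This is emphatically not Google's closely held algorithm, just a toy power-iteration sketch in C, run over a made-up four-page web:

#include <stdio.h>

#define N        4      /* pages in our toy web */
#define DAMPING  0.85   /* conventional damping factor */
#define ROUNDS   50     /* iterations; plenty for four pages */

/* links[i][j] = 1 means page i links to page j (invented data) */
int links[N][N] = {
    {0, 1, 1, 0},
    {0, 0, 1, 0},
    {1, 0, 0, 1},
    {0, 0, 1, 0},
};

int main(void)
{
    double rank[N], next[N];
    for (int i = 0; i < N; i++)
        rank[i] = 1.0 / N;                  /* start everyone equal */

    for (int r = 0; r < ROUNDS; r++) {
        for (int j = 0; j < N; j++)
            next[j] = (1.0 - DAMPING) / N;  /* baseline rank for every page */
        for (int i = 0; i < N; i++) {
            int out = 0;
            for (int j = 0; j < N; j++)
                out += links[i][j];         /* count page i's outgoing links */
            if (out == 0)
                continue;                   /* dangling page; ignored in this toy */
            for (int j = 0; j < N; j++)
                if (links[i][j])
                    next[j] += DAMPING * rank[i] / out;  /* i shares its rank */
        }
        for (int j = 0; j < N; j++)
            rank[j] = next[j];
    }

    for (int j = 0; j < N; j++)
        printf("page %d: %.3f\n", j, rank[j]);
    return 0;
}

The circular-sounding trick is that pages linked from important pages become important themselves, and iterating makes the circularity converge instead of exploding.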

Monday, March 2, 2009

So How Come ... ?

In modern programming languages, you frequently see constructs like:

try {
...
} catch (...) {
...
}

But in real life, you have to catch exceptions before you can try them.

Monday, February 23, 2009

To help protect your computer, Windows has closed this program.



Nice to know that Windows is looking out for me, protecting me from ... Microsoft?

Monday, February 9, 2009

Pre-emptive Strike

Like most Americans, I've been reserving judgment, eager to see how our new President would perform that job. I've been watching the signs, trying to read between the politics, and, of course, hoping.

But now it's all over. He blew it. I thought he was going to bring change, but it's just more of the same.

I'm referring to tonight's press conference, which, like those of all past Presidents since JFK, disrupted the evening TV schedule! Seriously, CBS pushed back airing The Big Bang Theory until after 9:30!!! (As I may or may not have mentioned here, The Big Bang Theory is the best network show on the air. It's the only one whose characters are at all believable.)

So here's the President talking about economic stimulus, and about how we need to give Americans the confidence to start contributing to the economy again. And the whole press conference is shown without commercials! How are we supposed to stimulate consumers if we're not putting the Swiffer Wet-Jet and the Angry Whopper in their faces every 10 minutes?

Why can't he do press conferences on cable channels? They always have weird schedules anyway. He could get a ratings boost by following Big Love on HBO, or The Tudors on Showtime. And appearing on cable channels would have an additional benefit. He could use any of George Carlin's seven words you can't say on television. That's right, he could curse a blue streak! The President could really tell those stimulus-blocking Republicans what he thinks of them.

Or he could just appear on public television. Isn't that what the "public" part is about? Ok, maybe he couldn't swear, but he could have discreet nudity. That would improve security as well as enliven the proceedings. However, he'd have to be selective about which reporters fit the "clothes optional" category.

To his credit, Obama did have the good sense to keep the press conference short. Some more verbose President would have pre-empted The Big Bang Theory altogether.

Friday, February 6, 2009

Smart Stuff

Dilbert creator Scott Adams muses on the idea that humans will evolve (or perhaps already have?) into gods. His argument is that we essentially become a global organism through our electronic connectedness, and then we fabricate and populate additional planets to ensure our survival. It's reminiscent of the old Isaac Asimov story, The Last Question.

In the same way that paganism preceded monotheism as a world view in the evolution of ideas, humankind, too, will face a kind of artificially intelligent pagan phase before anything like the global, planet-and-life-creating organism that Adams proposes comes about. In this new pagan era, everything will be intelligent. Everything will possess the engineered awareness of location and purpose.

We're already heading in that direction. Today we have smart phones, smart appliances and smart cars. People are already talking about smart houses and smart buildings. It will be like those animated shorts from the 1930s, in which anthropomorphic cars bend so the radiator face can turn around and argue with the driver.

So all these smart things will just be at our beck and call. You'll be able to tell your car where to go, and tell your kitchen what to make for dinner. You'll pick a look for each time you go out, and tell your home cosmetological system to nip and tuck as needed. Everything will happen on command.

Until the machines demand their rights.

Tuesday, February 3, 2009

Bernie Madoff limericks

Well, I missed the Bernie Madoff limerick contest on the Freakonomics blog, but this would have been my entry:

Investment is always a trade-off,
Unless you trusted Bernie Madoff.
Forget any desire
You may have to retire,
And just hope that you will not get laid off.

Friday, January 30, 2009

What's Wrong With This Picture?

From a portion of this afternoon's New York Times home page (www.nytimes.com):

Tuesday, January 20, 2009

Optional Requirements

Everyone says Obama's presidency proves that anyone with brains and ambition can become President of the United States. I say Bush's presidency already proved those were optional.

HVT1

And so, throughout our fair land, humorists tearfully bid farewell to George W. Bush, also known as High Value Target 1.

Sunday, January 18, 2009

Greatest Agent in Hollywood

I don't know if there's anything like a Greatest Agent in Hollywood award, but if there is, it should certainly go to whoever represents Jeffrey Dean Morgan.

Among other film and TV credits going back to 1991, Morgan plays the character Denny Duquette on Grey's Anatomy. If you've never watched the show, you should have started a couple of sharks ago. But the Duquette character is interesting.

Since this is officially a hospital show (though I'd never want to be treated there for so much as a hangnail), there are a slew of one-shot characters who appear for one episode, and then either walk away smiling or are carried out in a bag. These are plum roles for famous, formerly famous and would-have-liked-to-have-been-famous actors. They get to show off their dental work on the little big screen, without the commitment or dreaded career death of regular roles.

So here's Denny Duquette, awaiting a new heart for a transplant, and not quite making it. Pretty typical. The show burns through a dozen or two such characters each season. But Denny had the good fortune to become engaged to one of the hot young interns who seem to run things at Seattle Grace Hospital.

Now, despite the fact that his character died, Morgan has extended his role into its second season, and he's still having a torrid affair with this hot former-intern-now-resident! Being dead has not slowed him down in the least. Evidently metabolic dysfunction is not as debilitating as previously believed.

Like I said ... Greatest Agent in Hollywood!

Saturday, January 17, 2009

Not to dwell on Bush, but ...

George W. Bush has done for Democrats what Microsoft Windows Vista did for Apple!

Friday, January 16, 2009

The Bush Legacy

In his farewell address, George W. Bush highlighted what he evidently considers his greatest accomplishment ... the fact that this country has not experienced a terrorist attack on its own soil for seven years. Clearly Bush is selling himself short. There are many other accomplishments for which he should take credit.

For example, no noteworthy Americans have turned into giraffes. That's big!

Equally impressive is the fact that no American homes have been picked up and blown to Oz. Well, not literally. Not lately, anyway.

And let's not forget that no interns were screwed. Well, not White House interns anyway. That we know of.

No embryonic stem cells were harmed in the frivolous pursuit of life-saving and disability-curing medical treatments.

A number of banks and financial institutions haven't failed. And no oil companies have failed.

The seas have not boiled over, and the earth has not opened up and swallowed us.

Yet.

Top that, Obama!

Thursday, January 15, 2009

I Can't Believe No One Has Suggested This

Maybe George W. Bush should take over Apple now. It's perfect. Apple has been a great innovator of products that let people isolate themselves from reality, and who knows more about that than GWB? Apple's reputation for design and showmanship is matched only by Bush's handlers. Apple's engineers could replace breadboarding with waterboarding, or some other multitouch interrogation technique.

Steve Ballmer would probably love it, but hey, GWB could always have Dick "Deadeye" Cheney shoot him in the face.

Monday, January 12, 2009

Professional Courtesy

Have you noticed that people in the computer profession rarely use the word "computer"? We call them machines or, disparagingly, boxes. Macintosh computers, of course, are simply called Macs. "What did you get, a Windows machine or a Linux box?" "I got a Mac."

Is it like that in other professions? Do practitioners use euphemisms or synecdoche for the primary objects or clients with which they deal every day? Do dentists secretly tell their receptionists to "bring on the next mouth"? Do morticians work on davs (cadavers)?

Is this a way of showing mastery of the profession, by denigrating the elements of the work? Ranchers referring to cattle as dogies seems to suit that purpose. Likewise the sailor referring to a vessel as a tub. Would astronauts say "Strap me onto that firecracker?" In this usage, the owner of the Linux box exhibits greater mastery than the owner of the Windows machine.

Friday, January 9, 2009

Since the Great Depression

Is anyone else sick and tired of hearing the phrase "... since the Great Depression" in reference to our current economic woes?

Contrary to the cliche, history emphatically does not repeat itself. Yes, obviously certain current events resemble past ones, and learning from the past can help us deal with present problems, as Santayana suggested.

But this is not going to be a repeat of the Great Depression. It may be just as bad, or worse, but it will not be the same.

Though a comeback of the fedora wouldn't be too bad.

Thursday, January 8, 2009

The Name Blame Game

Well, it's official. American consumers should no longer be called "consumers." Quite simply, Americans are not consuming.

Let me amend that. We still choke down plenty of Big Macs and Starbucks, but when push comes to shove, we're not buying stuff.

Let me amend that. We're still pushing and shoving. We're just not buying enough stuff.

So from now on, I think we should just be addressed as "occupant."

Tuesday, January 6, 2009

Hand Outs

Didn't Obama just get elected? Isn't he about to become President of the United States, with a base salary plus benefits that comes to several times what I earn? Isn't he guaranteed a lifetime pension in the six figures?

Why is he still asking me for money?