Symbian vs Windows Smartphone

I often see statements that Symbian is better than Windows Mobile backed up by sales data. While I have no experience with either platform (I do have a Windows Mobile-based Audiovox SMT 5600 on order), I wonder how accurate it is to use sales data to back up these claims.

The reason I question this is that until this month, the largest cellular provider in the US offered just ONE Windows Mobile-based smartphone, the Motorola MPx220, which was often panned as a poor device. Cingular has offered multiple Symbian smartphones over the same period. In fact, as I look around there’s a dearth of Windows Smartphone devices. If you include Pocket PCs the number goes up, but few would put a $500-$600 Pocket PC in the same category as a $200 smartphone.

The truth of the matter is that there just aren’t that many Windows Mobile smartphones on the market. Most Windows Mobile devices are Pocket PCs with ginormous price tags. I expect to see a good jump in Windows Mobile sales this year and next. For one, Cingular now offers the Audiovox 5600, which is widely held to be the best Windows Smartphone on the market. Secondly, with the advent of Windows Mobile 5.0 this year (admittedly not a huge jump in functionality, but it supports a few things that will increase its desirability, such as QVGA screens), more and more devices will be rolling out with Windows Mobile installed.

Like I said, I have no experience with either phone operating system, but if the tech industry has shown us anything about sales data, it’s that high sales != high quality.

My new phone

Ok, I don’t have a new phone. But I really need one.

My current phone has horrible reception and I want to stab it with a fork almost every time I use it. Unfortunately, I can’t do that just yet, because the phone I want isn’t out on Cingular yet.

The Audiovox SMT 5600 is the phone I want, and today Cingular officially launched it. Of course it’s currently only available in select markets, and apparently the St. Louis market isn’t select enough. Argh!

Our new camera

My old Fujifilm FinePix 2600 has been acting up. Since I’m about to go on a little vacation to Chicago, I decided to go ahead and purchase a new camera.

I had a few requirements when picking out a replacement:

Megapixels
I needed at least 6 MP. I’m planning on purchasing the Epson Stylus Photo 2200, which is capable of prints up to 13″x19″, and a person needs a LOT of pixels to print photos that big.

External Flash Capable
While my old Fuji was capable of superb photography, I frequently found myself trying to take pics indoors where it was too dim to get quality results. There are a couple of ways you can use external flashes with cameras that support them. Some cameras have a flash-port which hooks up to compatible flash units via a cable. Other cameras have a hot-shoe that the flash unit attaches to.

Speed
One of the few remaining areas where digital cameras fall behind their film-based counterparts is the amount of time that elapses between pushing the button and when the photo is actually recorded. Some newer digicam models do a lot to reduce this time.

Image Quality
If you believe that more megapixels = more quality, you’re sadly mistaken. There are plenty of 2 and 3 megapixel cameras that take beautiful pictures and plenty of 4 and 5 megapixel cameras that take poor photos.

The perception that megapixels = quality has several sources. One is that In The Beginning there was a clear delineation between cameras that took high quality photos and those that didn’t, marked by megapixel ratings. This was back in the day of comparing .5 megapixel cameras to 1 megapixel cameras. The difference in quality wasn’t due solely to the number of pixels captured, but to the fact that low-megapixel cameras weren’t meant to take photo-quality shots. They were just toys.

Another source of the megapixels = quality myth is the marketing departments of camera manufacturers. The higher-dollar, higher-profit-margin cameras are also the cameras with more megapixels. Couple this with the fact that many people don’t want to do the research to determine which camera is really the best, and it’s easier to just sell people on the one number…megapixels.

Nowadays, the main thing a megapixel rating is going to tell you is how large you can make your prints. A 2 or 3 megapixel camera will make good 4×6 prints, and a 5 megapixel camera will make excellent 4×6 prints, great 5×7s, and pretty good 8×10s.
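If you want to sanity-check those print sizes, the arithmetic is just print dimensions multiplied by printing resolution. Here’s a rough back-of-the-envelope sketch; the dpi figures are my own rules of thumb (around 300 dpi for small prints, lower for big ones), not anything official:

```python
# Back-of-the-envelope: megapixels needed for a print at a given resolution (dpi).
# The dpi values are rough rules of thumb, not hard requirements.
def megapixels_needed(width_in, height_in, dpi):
    return (width_in * dpi) * (height_in * dpi) / 1e6

for (w, h), dpi in [((4, 6), 300), ((8, 10), 250), ((13, 19), 150)]:
    print(f"{w}x{h} at {dpi} dpi needs ~{megapixels_needed(w, h, dpi):.1f} MP")

# 4x6  at 300 dpi -> ~2.2 MP
# 8x10 at 250 dpi -> ~5.0 MP
# 13x19 at 150 dpi -> ~5.6 MP, which is why I set my floor at 6 MP
```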

I ended up settling on the Canon PowerShot G6. It has a 7.1 MP CCD, which allows me to do great 13×19 prints. It has a flash hot-shoe to allow attachment of external flash units. It has low shot-to-shot times thanks to Canon’s DIGIC image processor. Most importantly, it has superb image quality. Many reviews point out that it takes photos rivaling those of digital SLRs costing well over a thousand dollars.

It’s supposed to be delivered tomorrow. I’ll try to update this space with some sample photos before I leave on Thursday for my little vacation.

Software I’m using

Sometimes I get in the mood to see what kind of cutting-edge software is out there, so recently I started downloading a bunch of different programs to try out. Here’s some interesting stuff….

True Launch Bar
This is an interesting Quick Launch bar enhancement. I haven’t yet decided if it adds enough utility to justify keeping and paying for it.

A couple of nice features include the ability to add menus to the bar. This means you can keep more programs right on the task bar, making them easier to access than drilling through the Start menu. It seems to be quite a time saver if you use lots of different programs as I do.

Additionally, you can put small plugins on the bar to display things like bandwidth, weather, a small calculator (too small for me), a command line, etc. Here’s how mine looks right now:

True Launch Bar

It’s kind of neat, but it’s hard to say if it’ll become something I keep.

Celestia

This is an amazing program. I’ve heard about it several times over the years but have never spent the time to try it out. If you want to be humbled by our universe, definitely download this.

Celestia is a 3D space simulator. It’s not a game; it’s an actual model of our universe, with 100,000 stars in our own galaxy alone. You can fly around aimlessly looking at the different planets in our solar system, or download one of the many scripts that take you on a guided tour and really put into perspective the distances involved and the relationships between all the different objects of our solar system you hear about.

I consider myself quite the science buff, yet this program really amazed and educated me. Screenshots don’t do it justice. The educational part is the movement: moving at many times light speed through the galaxy really shows you how things are related to each other.

If you don’t download this there’s something wrong with your brain.

Cassini with Titan, Sun, and Milky Way in background

I’ll post some more stuff tomorrow.

Moore’s law

Intel recently posted a $10,000 reward for a mint copy of the April 1965 issue of Electronics magazine, which was claimed by a British man.

Why was this issue so important? It was the magazine where the co-founder of Intel, Gordon Moore, first stated what is now referred to as Moore’s Law. Moore’s Law has accurately predicted the advancements in semiconductor technology since then.

What exactly is Moore’s Law? Ripped straight from Electronics magazine:

The complexity for minimum component costs has increased at a rate of roughly a factor of two per year … Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer.

What does this mean? Strictly speaking, approximately every 18 months the most complex processor on the market has double the number of transistors of the most complex processor available 18 months earlier. This “law” isn’t so much a law as a prediction based on past data…but it has basically held true since Moore made it. The capabilities of a processor are largely determined by the number of transistors available to it, which makes this an amazing thing to think about. Remember that this is an exponential increase. Think of the increase in the capabilities of computers over the past 20 years…the increase over the next 20 years will be many times that (at one doubling every 18 months, 20 years is roughly 13 doublings, a factor of about 10,000)!

Current computer processors are fabricated with their features at the 130 nanometer and 90 nanometer scales. Companies everywhere are working feverishly to advance to smaller scales such as 50 and 30 nanometers. Within 10-20 years, however, we will approach a serious limit: the size of atoms.

The importance of the size of the transistors is related to how chips are manufactured. There is a limit to the total size of such chips, so the way to increase the number of transistors is to shrink their size. You can’t just add on more transistors without making the chip extremely expensive to manufacture.

Current CPUs like the Intel Pentium 4 and AMD’s Athlon 64 have from 50-100 million transistors. Within a decade, processors will have from 10 to 100 times as many transistors, depending on exactly what you take as the doubling time: 18 months or three years.
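To put numbers on that range, here’s the exponential-growth arithmetic. The 75 million starting count is just a round figure I picked from the 50-100 million range above, not any particular chip:

```python
# Moore's Law as simple exponential growth: the count doubles every `doubling_years`.
def transistors_after(start, years, doubling_years):
    return start * 2 ** (years / doubling_years)

start = 75e6  # a round number for a current CPU, somewhere in the 50-100 million range

for doubling_years in (1.5, 3.0):
    factor = 2 ** (10 / doubling_years)
    total = transistors_after(start, 10, doubling_years)
    print(f"doubling every {doubling_years} years: ~{factor:.0f}x in a decade "
          f"(~{total / 1e9:.1f} billion transistors)")

# doubling every 1.5 years -> ~102x in a decade
# doubling every 3.0 years -> ~10x in a decade
```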

There are many ideas about ways to get around the atomic size limit. Moore seems to favor just devising ways to increase chip size.

So is there a limit? Krauss and Starkman, at Case Western Reserve University and CERN, have deduced that the computing power of any device in the universe is finite and conclude that Moore’s Law can’t continue for more than about 600 years. We’ve only used up 40 of those.

Twins

This is just bizarre.

Dell 2005FPW LCD panel

So I finally decided to give an LCD monitor a try after hearing SA goons sing the praises of the Dell 2005FPW and its non-widescreen sibling, the 2001FP.

Apparently there are some production problems where some of these monitors have backlight leakage, but Dell is really good about letting you exchange the monitor until you get one you’re happy with. They have the same policy toward dead pixels. In fact, I’m going to give Dell support 4 out of 5 stars for my dealings with them.

“Classical holy grail” junk

For the past few days, talk has been going fast and furious about an article in The Independent describing a new way of reading the Oxyrhynchus Papyri.

Now, in a breakthrough described as the classical equivalent of finding the holy grail, Oxford University scientists have employed infra-red technology to open up the hoard, known as the Oxyrhynchus Papyri, and with it the prospect that hundreds of lost Greek comedies, tragedies and epic poems will soon be revealed.

In the past four days alone, Oxford’s classicists have used it to make a series of astonishing discoveries, including writing by Sophocles, Euripides, Hesiod and other literary giants of the ancient world, lost for millennia. They even believe they are likely to find lost Christian gospels, the originals of which were written around the time of the earliest books of the New Testament.

It sounds amazing!

Unfortunately, it appears something isn’t quite right. Hannibal of Ars Technica, speaking from what appears to be a somewhat authoritative position, lays out some compelling reasons to doubt the claims made in The Independent, or at least reasons not to think the story is so amazing.

Hopefully we’ll get some more concrete information in the weeks to come.