Outlook 2003 ate my mail

Yesterday, my Outlook 2003 hit the 2 GB limit on its PST file, and almost caused me to lose some important email.

To make me feel better, I’m describing the circumstances here. While I doubt anyone else affected will read this before suffering the same fate, they may at least gain some comfort after the fact from knowing they are not alone…

Anyway, the scenario goes like this. Outlook 2003 has a documented 2 GB limit on PST files (the files that store your Inbox and its sub-folders). Unlike earlier versions of Outlook, which would happily corrupt your PST file with no warning if you exceeded the 2 GB limit, Outlook 2003 now traps this and (eventually) displays an error message saying the file is full.

There are two side effects that are not so obvious. The first is relatively benign: any messages in your Outbox remain there after successful transmission, because there is no room to move them to Sent Items. Thus, you mistakenly think they haven't been sent for some reason and waste time trying to send them again; the recipient often ends up with multiple copies.

The more serious bug, and the one that really annoyed me, is that when Outlook downloads new messages, it realises it has nowhere to store them … and just throws them away. My account is configured to leave messages on the mail server for two days after download, but I expect most users stay with the default account settings, which delete messages as soon as they are downloaded. Such users would likely not even be aware that they had lost mail, since it is irretrievably gone, with no record that it even existed. Certainly, Outlook gives no indication that anything is amiss.
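Until Microsoft fixes this, about the only defence is to watch the PST file size yourself, before Outlook hits the wall. As a rough illustration (my own sketch, not anything Outlook provides; the thresholds are arbitrary), a small script along these lines could warn you in time to archive old mail:

```python
import os

# Outlook 2003's documented hard limit on a PST file.
PST_LIMIT = 2 * 1024 ** 3
# Arbitrary early-warning threshold: 90% of the limit.
WARN_AT = int(PST_LIMIT * 0.9)

def check_pst(path):
    """Return a status string for the given PST file's size."""
    size = os.path.getsize(path)
    if size >= PST_LIMIT:
        return "FULL: %d bytes - Outlook may silently discard new mail!" % size
    if size >= WARN_AT:
        return "WARNING: %d bytes - archive old mail soon" % size
    return "OK: %d bytes" % size
```

Run periodically (e.g. from Scheduled Tasks) against your PST file, this would at least give some notice before mail starts disappearing.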

Because this happened to my wife a couple of months ago, I knew to check my mail server’s webmail interface for lost messages, and sure enough, there were several sitting there. When I freed up space on Outlook and downloaded my mail again, it happily ignored those messages, since it considered them already fetched, even though it had discarded them at the time.

Unfortunately, I forgot to check my secondary account’s webmail interface as well, so I only became aware of yet more missed messages when someone followed up to ask why I hadn’t responded.

Come on Microsoft, how hard would it have been to do this right??

(I’d upgrade to 2007, but I can’t stand the new ribbon strip.)

Vista audio glitch on Sony Vaio V505CP

My personal laptop is a Sony Vaio V505CP. It’s almost five years old, but after upgrading it to 1.5 GB RAM and a 160 GB hard drive, it does a good job of running Windows Vista SP1. I use it mainly when I’m out on consulting jobs or travelling, so I don’t need anything more powerful.

A month or two back, I noticed that whenever I played music, the audio would glitch every 20-30 seconds. The mouse pointer also froze briefly during the glitches. This was something new — it was working fine after the original upgrade to Vista SP1. Today, I finally got around to investigating the cause.

For those in a hurry: the culprit was NDIS.SYS, the network driver. Simply turning off the built-in wireless adapter (easily done with a switch on the front of the Sony’s case) made the glitch go away. I can live with this for now, since I rarely use wireless and stream music at the same time; normally, I leave the wireless enabled all the time, just in case I might need it, but it’s not a big deal to turn it off.

So how did I figure this out? Here are the steps I followed, including dead ends (the journey is often as interesting as the destination).

My first port of call was Windows Task Manager. I usually start here, simply because Task Manager is standard on every Windows PC. In this case, it showed a big CPU spike whenever the glitch occurred, but unfortunately, Task Manager itself froze during the glitch, so I couldn’t identify which process was responsible.

The next tool to try was Process Explorer from SysInternals, a wonderful tool for all kinds of system probing. It includes a dummy task entry for DPCs (Deferred Procedure Calls), so I could see that during glitches, DPC activity was responsible for almost all the additional CPU usage.

Deferred Procedure Calls are a Windows kernel mechanism: when a device driver’s interrupt handler has more work to do than it can safely perform at interrupt time, it queues the remainder as a DPC. Once the interrupt completes, all outstanding DPCs are executed in sequence; hence the brief jump in CPU activity.

Some Googling brought me to a handy tool called DPC Latency Checker, which graphs DPC usage over time. Using this confirmed the theory:

DPC Latency Checker graph

The red peaks indicate unusually long DPC latencies, which can cause problems such as audio drop-outs. Unfortunately, the checker didn’t tell me the cause of these, but I knew it was probably a badly written device driver. The checker’s website suggested using Microsoft’s RATTv3 developer tool to identify the culprit.

I hadn’t come across RATTv3 before, but it’s very useful — if you’re running Windows XP, that is. It records the latency of each and every device driver’s interrupt calls, correlated over time, and identifies the bad ones for you. Unfortunately, it doesn’t work too well under Vista.

Back to Google, where I found a thread suggesting Microsoft’s new Performance Toolkit as a good Vista alternative. This does everything RATTv3 can do, and a lot more besides.

Update Dec 2015: this is now part of Microsoft’s Windows Assessment and Deployment Kit (ADK) — it still works on Windows 7, despite Windows 8.1 references.

One thing to watch out for: after installing the toolkit, all the files end up in “C:\Program Files\Microsoft Windows Performance Toolkit”. I suggest copying them to C:\XPerf for convenience, since the installation folder is not added to the system path automatically.

After installing, you can begin a new capture by opening a command prompt in the toolkit folder and typing:

   xperf -on DiagEasy

to begin tracing. Then let the system run for a minute or two, until the glitch has occurred. Next, turn off tracing and convert the results to a file suitable for viewing:

   xperf -d trace.etl

Finally, view the accumulated results:

   xperf trace.etl
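The three steps above can be scripted, for when you need to repeat a capture several times (as I ended up doing). A minimal Python sketch, assuming xperf.exe is reachable on the PATH (e.g. copied to C:\XPerf as suggested); only run_capture actually invokes the toolkit, so it naturally requires a machine with it installed:

```python
import subprocess

def xperf_commands(etl_file="trace.etl"):
    """Return the three xperf command lines used in the capture workflow."""
    return [
        ["xperf", "-on", "DiagEasy"],  # start kernel tracing
        ["xperf", "-d", etl_file],     # stop tracing and merge to an .etl file
        ["xperf", etl_file],           # open the results in the viewer
    ]

def run_capture(etl_file="trace.etl"):
    """Run the full capture workflow (requires the Performance Toolkit)."""
    for cmd in xperf_commands(etl_file):
        subprocess.run(cmd, check=True)
```

In practice you would pause between the first and second steps, long enough for the glitch to occur.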

Pretty easy! Doing this produced a summary page with a ton of detail, most of which I didn’t need (CPU usage, disk I/O, etc.). The thing I was interested in was DPC activity, which it showed as a graph like this:

XPerf DPC summary display

The peaks in the graph show unexpectedly high DPC levels. By selecting one of those peaks and right-clicking, I was able to display a summary of all the driver activity that had contributed to the peak:

XPerf DPC summary of device activity

From this, it was clear that NDIS.SYS was the culprit, with a worst-case latency of more than 200 ms. (I repeated the test several times to confirm this, and it was consistently in top place.)

So mystery solved. From here, it was an easy step to try disabling the network adapters to see if that fixed things, and turning off the wireless adapter did the trick.

So when I have a chance, I’ll upgrade my wireless driver and hopefully that will sort it out permanently. For now, though, I’m just glad to have properly working audio again.

Telepresence in the air

Marc Andreessen’s blog today mentioned this cool demo:

This uses video goggles with a head-tracking sensor to remotely control the orientation of a camera mounted on a pilot-less plane, letting you virtually explore the heavens.

Apart from the general wow-factor of flying around the sky without ever leaving the ground, it reminded me of another piece of impressive technology I came across recently: quad-copters.

Here, a high-speed DSP combines realtime feedback from gyros and sensors (position, wind direction, etc.) to control four rotor blades independently, allowing stationary hovering in a wide range of conditions with no pilot input required. Great for remote video surveillance and similar applications.

Combining these two pieces of technology seems like a perfect opportunity. Has anyone done it yet?

And a missing piece of the puzzle: even using stereo cameras to feed the video goggles, the image will still appear flat, since there is no way to refocus it remotely (other than relying on auto-focus). Has anyone developed video goggles that can track the eye’s attempts to focus on specific objects? Combine that with a pair of remote cameras that adjust their focus to match and you could have REAL telepresence (provided the latency isn’t too high, of course).

Isn’t it great that we live in an age where such amazing technology is affordable enough to let people devise interesting hacks in their spare time…?

Turn any surface into a touchscreen

Thanks to Kieran for pointing me towards this impressive Wii Remote hack, covered by Engadget here.

Johnny Chung Lee has done a marvellously simple hack which uses the standard Wii remote controller, plus some ballpoint pens modified to emit infrared, to convert any surface into an interactive touchscreen. With multiple pens, you can support multi-touch effects (as seen on the iPhone and iPod Touch, and previously mentioned on this blog back in March 2006).

Here’s an example of his technique in use:

(Make sure you watch the video long enough to see the technique in action; it’s very impressive, especially when combined with a video projector.)

Johnny’s software to make all this work is free, and available here.

OpenSocial: now Facebook is for everyone

Marc Andreessen’s always-interesting blog today talks about OpenSocial, a new standard spearheaded by Google which aims to provide a common API for embedding new web content and apps across all the main social network sites.

This is very similar to the Facebook Platform API launched a few months ago to critical acclaim, but with the significant difference that OpenSocial is an open standard, and pretty much everyone except Facebook is jumping on the bandwagon to support it.

Marc’s blog does a much better job of describing the benefits than I can. In a nutshell, though, it means that if you run a website that offers a useful service, you can now easily allow it to be embedded in any of the main social networking sites (LinkedIn, Friendster, Ning, etc.) by just adding some basic HTML & JavaScript support.

This should be fun…

(The official Google launch is tomorrow, at which point http://code.google.com/apis/opensocial should become operational.)

Seam Carving & Tiny First Person Shooters

My friend James was impressed by the Tilt-shift photography I mentioned in the previous post, and sent me some related material.

Seam carving is an image resizing technique which works by identifying horizontal and vertical seams with low information content and then removing them, rather than simply removing pixels according to a fixed scaling algorithm.

This means that the proportions of important items within the picture are maintained. The same technique can be adapted to increase the size of an image (especially in a single dimension) without making it look skewed. And more intriguingly, by first marking parts of the image as “low value”, you can seamlessly erase elements of a picture automatically — no Photoshop expertise required.
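To make the idea concrete, here is a minimal sketch of the technique (my own illustration, not the authors’ code), which finds and removes one low-energy vertical seam from a greyscale image using dynamic programming, with a simple gradient-magnitude energy function:

```python
import numpy as np

def energy(img):
    """Simple energy map: sum of absolute horizontal and vertical gradients."""
    g = img.astype(float)
    dy = np.abs(np.diff(g, axis=0, prepend=g[:1]))
    dx = np.abs(np.diff(g, axis=1, prepend=g[:, :1]))
    return dx + dy

def find_vertical_seam(e):
    """Return, for each row, the column index of the minimum-energy seam."""
    h, w = e.shape
    cost = e.copy()
    # Accumulate minimum path cost top-down; each pixel may connect to the
    # three pixels above it (left, centre, right).
    for y in range(1, h):
        for x in range(w):
            lo, hi = max(x - 1, 0), min(x + 2, w)
            cost[y, x] += cost[y - 1, lo:hi].min()
    # Backtrack from the cheapest bottom pixel.
    seam = [int(np.argmin(cost[-1]))]
    for y in range(h - 2, -1, -1):
        x = seam[-1]
        lo, hi = max(x - 1, 0), min(x + 2, w)
        seam.append(lo + int(np.argmin(cost[y, lo:hi])))
    return seam[::-1]

def remove_seam(img, seam):
    """Drop one pixel per row, narrowing the image by a single column."""
    h, w = img.shape
    return np.array([np.delete(img[y], seam[y]) for y in range(h)])
```

Repeating this shrinks the image column by column while leaving high-energy (detailed) regions untouched; marking areas as “low value” amounts to forcing their energy towards zero.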

This YouTube video does a good job of describing it:

Not content with this, James also pointed me towards .kkrieger, a simple 3D shoot-em-up with an impressive twist: the executable size is less than 100 KB. (Yes, that’s kilobytes.) The program would easily have fitted onto a standard 170 KB floppy disk from the Commodore 64 era 25 years ago!

Despite this, the game has pretty decent graphics and sound, not dissimilar to Doom, as this screenshot shows:

Screenshot from .kkrieger

The amazingly small file size is achieved by generating all textures algorithmically at runtime. This leads to long, though not excessive, load times.
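.kkrieger’s real generators are far more sophisticated, but the principle is easy to illustrate: a few lines of code stand in for kilobytes of stored bitmap. A toy example of my own, computing a plasma-style greyscale texture from sine waves:

```python
import math

def texture(width, height):
    """Generate a plasma-like greyscale texture procedurally.

    The entire width x height image is computed at runtime from a
    handful of sine waves, so nothing needs to be stored on disk.
    """
    pixels = []
    for y in range(height):
        row = []
        for x in range(width):
            v = (math.sin(x * 0.3) + math.sin(y * 0.2)
                 + math.sin((x + y) * 0.15))
            row.append(int((v + 3) / 6 * 255))  # map [-3, 3] to [0, 255]
        pixels.append(row)
    return pixels
```

A 512x512 texture from this function is a quarter of a megabyte of pixel data, produced by roughly a dozen lines of code: that trade-off, repeated across every texture, mesh and sound, is how the whole game squeezes under 100 KB.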

To download the game or read more about it, visit the main .kkrieger website.

Tilt-Shift Photography

When I had a film camera 15 years ago, I took almost no photographs with it: about one roll of film per year, on average. Then I got a digital camera, and since then I’ve taken a ridiculous number of photos – currently around 36,000 and climbing.

So, while I wouldn’t call myself a big photography buff, I do have a passing interest in photography techniques and methods.

No doubt that’s why my friend Steve sent me a link to this website, which describes Tilt-Shift photography, a style that makes normal scenes look like they are in miniature:

We’re used to looking at photos where everything is in focus (to infinity), so when the depth of field is restricted, the brain is tricked into thinking it’s a model scene. The effect is quite surreal!

Check out the website mentioned above for more information.

Windows Vista select-all bug

I’ve been using Windows Vista for a few months now, and am still finding that for every nice new feature I like, there is a change of behaviour or missing element that I dislike just as much. Ah, progress…

Last night, however, I encountered a new bug which is both minor and infuriating: the ability to select multiple files in Windows Explorer vanished. Dragging out a selection box with the mouse, holding down Shift or Control, and even choosing Select All from the Edit menu all fail. It’s hard to describe just how annoying it is not to have this simple capability.

It turns out this is a well-known Vista bug, first reported back in the Vista beta days, and there are three solutions. Two are well documented; the third is much more difficult to find. Naturally, the third solution was the one that I needed.

For convenience, here they are (in order of simplicity).

1. Go into Tools -> Folder Options -> Views and choose Reset Folders (also available under “Organise -> Folder & Search Options”). You may need to do this from within multiple Explorer windows before it finally works.

2. Alternatively, run RegEdit and delete all keys under this one:

HKCU\Software\Classes\Local Settings\Software\Microsoft\Windows\Shell

(i.e. Bags, BagsMRU, and MuiCache) in their entirety. Do this with no Windows Explorer windows open; ideally, kill the Explorer process (e.g. from Task Manager) before deleting the keys.

3. When neither of the above two methods works, run this FixSingleSelect VBScript and it will sort it out – at least, it did for me.

I’m frankly stunned that Microsoft have (a) allowed a bug with such widespread impact to make it through to final release, and (b) not issued a patch to address the problem.

Fixing Windows XP’s sluggish behaviour

ICPUG is one of the oldest computer organisations in the UK, having recently celebrated its 25th anniversary. Almost every year since around 1992, I’ve attended the annual ICPUG computer weekend at the Queens Armes hotel in the village of Charmouth on the Dorset coast.

I’m just back from this year’s event, which was as entertaining as ever (talks ranged from helicopters to pure maths to safe-cracking in Nigeria, and there was even some computing thrown in for good measure). One of the most useful things I came away with, however, was a simple Windows XP tip that can dramatically improve responsiveness on many systems.

The Start menu on XP has a Documents sub-menu that conveniently lists the last 10 or so documents which have been worked on – very handy if you want to go back and edit a recent file. XP creates this menu from the most recent document shortcuts from the hidden ‘Recent’ folder in your User profile.

However, XP has no mechanism to automatically empty the Recent folder; instead, the more folders, files and documents you open, the more shortcuts accumulate here. On my own system, there were about 1600 shortcuts listed (including many duplicates), dating back to 2004.

Simply emptying out this folder can produce a notable improvement in response speed for things like opening new browser windows, double-clicking document files, and even opening disk folders. I tried it on my system, and the effect was immediate – it felt as fast as a brand new XP installation again.

Because the folder is hidden, the easiest way to get to it is to select Run from the Start menu, then enter:

   %userprofile%\Recent

as the command to run. This will open the Recent folder, and you can see how many shortcuts are listed. Then a simple Select All followed by Delete will get rid of them once and for all.
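If you’d rather not empty the folder completely (losing the convenience of the Documents menu), a small script can prune only the stale shortcuts. This is my own sketch, not part of Windows, and it assumes you pass in the path to your Recent folder:

```python
import os
import time

def prune_recent(folder, max_age_days=30):
    """Delete .lnk shortcuts older than max_age_days; return names removed."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for name in os.listdir(folder):
        if not name.lower().endswith(".lnk"):
            continue  # only touch shortcut files, nothing else
        path = os.path.join(folder, name)
        if os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

Run monthly, this keeps the folder down to recent, useful shortcuts instead of years of accumulated duplicates.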

Credit for this tip must go to Brian Grainger, webmaster of the ICPUG UK site; thanks Brian!

(As a final footnote, the Queens Armes has been sold to new owners as of May 24, 2007, and I believe the name will be changing to Abbotsville or Abbotshead.)

Google Apps / Digital Ethnography

Last night, I attended the monthly meeting of SAGE-IE, the Systems Administrators Guild of Ireland (old website here).

The evening’s talk was on Google Apps, presented by Sam Johnston and Laurent Gasser of Microcost. I had only been peripherally aware of Google Apps, so I figured it would be a good chance to find out some more.

Sam & Laurent both gave engaging and enthusiastic presentations. Microcost is in the business of helping enterprises to move their internal services (e-mail, calendar/scheduler, collaborative document editing, etc.) to Google Apps, with the potential for both large cost savings and significant improvements in productivity.

Some random interesting titbits I took away from the evening:

  • Total cost of upgrading a corporate workstation to Windows Vista is estimated as €2,500 (Microsoft estimate) to €5,000 (independent estimate). This is enough to provide the same user with 50-100 years of Google Apps service. (Google Apps are $50 per user per year for a premium subscription).
  • Microcost use Amazon’s S3 to store enterprises’ back-end data, with another service encrypting the data to/from Amazon (to address any potential privacy concerns). It wasn’t clear to me how this interfaces with Google Apps, since the details were glossed over.
  • There are significant productivity gains from having proper, shared document editing. When documents always live in the cloud, anyone (with appropriate authorisation) can access them from anywhere, anytime. Multiple users editing the same document can arrive at a final version much more quickly and effectively than the more traditional route of swapping Word and Excel files via email.
  • A big advantage of online apps, such as Google Apps, is that upgrades can happen completely seamlessly without the user having to do anything. Upgrades are small and frequent, rather than large and infrequent. Since everyone using the app is updated simultaneously, there is more scope for making fundamental changes to the underlying code without having to be as concerned with backwards compatibility.
  • One audience member was concerned that organisations could become dependent on certain functionality which might then disappear in a future release, with no control or comeback. Laurent conceded that this was a possibility for individual users, who may grow attached to some particular quirk of the system, but less likely to affect enterprises where Google (or whoever) track user preferences closely.
  • There was also some concern over whether organisations would be willing to move all their data into the cloud. Another audience member commented that larger organisations are already used to giving up control of some or most of their data, by way of internal data centres and outsourced IT support, so they don’t see it as a big leap. For smaller companies, this is a more significant hurdle.
  • Laurent mentioned that in over a year of using Google Apps, he has yet to find any significant bugs or stability concerns. I think this is key: Google tend to make very reliable and solid web apps, which instil confidence in the user. They have a lot of experience building fault-tolerant systems. If the execution were less than 100%, I expect most users would lose confidence very quickly indeed.

Also, as an aside, Sam mentioned that Trinity College recently announced that they will be moving all student email to Gmail. He expects most other colleges to follow in their footsteps.

The presentation finished off with a look at Mike Wesch’s recent Digital Ethnography video which puts a lot of the Web 2.0 stuff into context. I hadn’t seen this before (though it’s been creating quite a buzz), and it’s well worth watching – download the 67 MB high-resolution version for the best experience.