All posts by eddy

Fixing Windows XP’s sluggish behaviour

ICPUG is one of the oldest computer organisations in the UK, having recently celebrated its 25th anniversary. Almost every year since around 1992, I’ve attended the annual ICPUG computer weekend at the Queens Armes hotel in the village of Charmouth on the Dorset coast.

I’m just back from this year’s event, which was as entertaining as ever (talks ranged from helicopters to pure maths to safe-cracking in Nigeria, and there was even some computing thrown in for good measure). One of the most useful things I came away with, however, was a simple Windows XP tweak that can dramatically improve responsiveness on many systems.

The Start menu on XP has a Documents sub-menu that conveniently lists the last 10 or so documents which have been worked on – very handy if you want to go back and edit a recent file. XP builds this menu from the most recent document shortcuts in the hidden ‘Recent’ folder in your user profile.

However, XP has no mechanism to automatically empty the Recent folder; instead, the more folders, files and documents you open, the more shortcuts accumulate here. On my own system, there were about 1600 shortcuts listed (including many duplicates), dating back to 2004.

Simply emptying out this folder can produce a notable improvement in response speed for things like opening new browser windows, double-clicking document files, and even opening disk folders. I tried it on my system, and the effect was immediate – it felt as fast as a brand new XP installation again.

Because the folder is hidden, the easiest way to get to it is to select Run from the Start menu, then enter:

%HOMEPATH%\Recent

as the command to run. This will open the Recent folder and you can see how many shortcuts are listed. Then a simple Select All followed by Delete will get rid of them once and for all.
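If you’d rather not repeat this by hand every few months, the clean-out is easy to script. Here’s a minimal Python sketch of my own (not part of the original tip) that assumes the Recent folder lives in its standard XP location under the user profile:

    # clear_recent.py - a rough sketch; adjust the path if your profile
    # is stored somewhere non-standard.
    import os
    import glob

    recent = os.path.join(os.environ["USERPROFILE"], "Recent")
    shortcuts = glob.glob(os.path.join(recent, "*.lnk"))
    print("Deleting %d shortcuts" % len(shortcuts))
    for lnk in shortcuts:
        os.remove(lnk)

Run it as the user whose Recent folder you want to clear; deleting the shortcuts doesn’t touch the documents they point to.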

Credit for this tip must go to Brian Grainger, webmaster of the ICPUG UK site; thanks Brian!

(As a final footnote, the Queens Armes has been sold to new owners as of May 24, 2007, and I believe the name will be changing to Abbotsville or Abbotshead.)

Google Apps / Digital Ethnography

Last night, I attended the monthly meeting of SAGE-IE, the Systems Administrators Guild of Ireland (old website here).

The evening’s talk was on Google Apps, presented by Sam Johnston and Laurent Gasser of Microcost. I had only been peripherally aware of Google Apps, so I figured it would be a good chance to find out some more.

Sam & Laurent both gave engaging and enthusiastic presentations. Microcost is in the business of helping enterprises to move their internal services (e-mail, calendar/scheduler, collaborative document editing, etc.) to Google Apps, with the potential for both large cost savings and significant improvements in productivity.

Some random interesting titbits I took away from the evening:

  • Total cost of upgrading a corporate workstation to Windows Vista is estimated as €2,500 (Microsoft estimate) to €5,000 (independent estimate). This is enough to provide the same user with 50-100 years of Google Apps service. (Google Apps are $50 per user per year for a premium subscription).
  • Microcost use Amazon’s S3 to store enterprises’ back-end data, with another service encrypting the data to/from Amazon (to address any potential privacy concerns). It’s not clear to me how this interfaces with Google Apps, since it was glossed over – but the general encrypt-before-upload idea is sketched after this list.
  • There are significant productivity gains from having proper, shared document editing. When documents always live in the cloud, anyone (with appropriate authorisation) can access them from anywhere, anytime. Multiple users editing the same document can arrive at a final version much more quickly and effectively than the more traditional route of swapping Word and Excel files via email.
  • A big advantage of online apps, such as Google Apps, is that upgrades can happen completely seamlessly without the user having to do anything. Upgrades are small and frequent, rather than large and infrequent. Since everyone using the app is updated simultaneously, there is more scope for making fundamental changes to the underlying code without having to be as concerned with backwards compatibility.
  • One audience member was concerned that organisations could become dependent on certain functionality which might then disappear in a future release, with no control or comeback. Laurent conceded that this was a possibility for individual users, who may grow attached to some particular quirk of the system, but less likely to affect enterprises where Google (or whoever) track user preferences closely.
  • There was also some concern over whether organisations would be willing to move all their data into the cloud. Another audience member commented that larger organisations are already used to giving up control of some or most of their data, by way of internal data centres and outsourced IT support, so they don’t see it as a big leap. For smaller companies, this is a more significant hurdle.
  • Laurent mentioned that in over a year of using Google Apps, he has yet to find any significant bugs or stability concerns. I think this is key: Google tend to make very reliable and solid web apps, which instill confidence in the user. They have a lot of experience building fault-tolerant systems. If the execution is less than 100%, I expect most users would lose confidence very quickly indeed.
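On the S3 point above: I have no idea how Microcost’s encryption service actually works, but the encrypt-before-upload idea itself is simple enough to sketch. The Python below is purely illustrative and uses today’s boto3 and cryptography libraries; the bucket and file names are made up.

    # Illustrative only - not Microcost's implementation.
    import boto3
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # in reality, managed and stored securely
    cipher = Fernet(key)

    with open("backend-data.db", "rb") as f:
        ciphertext = cipher.encrypt(f.read())   # encrypt locally, before transit

    # Only the ciphertext ever reaches Amazon.
    boto3.client("s3").put_object(
        Bucket="example-enterprise-backups",    # hypothetical bucket
        Key="backend-data.db.enc",
        Body=ciphertext,
    )

The key point is that Amazon only ever sees encrypted bytes, which is what addresses the privacy concern mentioned in the talk.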

Also, as an aside, Sam mentioned that Trinity College recently announced that they will be moving all student email to Gmail. He expects most other colleges to follow in their footsteps.

The presentation finished off with a look at Mike Wesch’s recent Digital Ethnography video which puts a lot of the Web 2.0 stuff into context. I hadn’t seen this before (though it’s been creating quite a buzz), and it’s well worth watching – download the 67 MB high-resolution version for the best experience.

Server room on a chip

Jon Stokes from ArsTechnica has a good article on Intel’s recently announced Terascale 80-core processor.

When you have so much processing power available, how do you make best use of it? One way is to treat it as a virtual server room, and run virtual machines on each core. For example, a heavily trafficked website, which is traditionally spread across multiple web & database servers, could be hosted completely on a single piece of silicon, with corresponding cost and power savings (especially power).

Then there’s the problem of how to keep such a fast chip adequately supplied with data, to ensure it doesn’t spend too much idle time waiting for new packets to arrive. There are some hints that Intel may be about to announce on-chip optical support.

I doubt we’ll see this technology on the desktop any time soon (though you never know), but in an era where datacentres are routinely sucking up megawatts of power, it’s useful to have a potential glimpse of a future where the entire room may be reduced to a single server cabinet.

(Let’s ignore storage, for now…)

Compaq’s Problematic ProLiant

For the past month, I’ve been upgrading the hard disk on a friend’s Compaq ProLiant ML-370 server (first generation). I finally completed the job on Friday.

Why so long? Surely a hard drive upgrade is a five-minute job? Well, usually, yes. However, Compaq have gone to some trouble to make it difficult.

The old server had two 18 GB UltraSCSI drives, and had run out of space. The plan was to add a cheap, large IDE drive to provide additional data storage, while retaining the SCSI drives for Windows 2000 and Microsoft Exchange Server. Thus, a 160 GB Maxtor IDE drive was purchased.

Compaq’s ProLiant server

First problem: the ProLiant has an internal IDE interface, but it does not support hard drives! The BIOS actually says this when you try to configure it – or it would if there were a BIOS, which there isn’t. Instead, you have to download a four-floppy-disk System Configuration Utility from Compaq’s website. In this day and age, even finding four working floppy disks (not to mention a working floppy drive, in a server that has been actively sucking in dust for almost five years) is a challenge in itself.

(Later on, I discovered you can also download the SmartStart 5.50 CD to do the same thing; there are later versions available but they don’t support the ML-370 G1).

Anyway, ignoring what the BIOS said, I connected the IDE drive to the internal IDE interface and booted up. Windows found the drive okay, but decided to limit it to 137 GB capacity (despite running SP4, and enabling the enhancement to support 48-bit LBA addressing). No matter, we could live with 137 GB.
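As an aside, the 137 GB figure isn’t arbitrary: it’s the ceiling imposed by the older 28-bit LBA addressing scheme, which is exactly what the 48-bit enhancement is supposed to lift. A quick back-of-envelope calculation (my own, in Python) shows where the number comes from:

    # 28-bit LBA can address at most 2**28 sectors of 512 bytes each.
    sectors = 2 ** 28
    capacity = sectors * 512
    print(capacity)              # 137438953472 bytes
    print(capacity / 1e9)        # ~137.4 decimal gigabytes
    print(capacity / 2.0 ** 30)  # 128 binary gigabytes (GiB)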

What we couldn’t live with was the speed – appallingly slow, and with 100% CPU utilisation whenever the drive was accessed. A quick visit to Device Manager confirmed that the interface was running in PIO mode – Programmed I/O – rather than using UltraDMA which is much, much faster. Apparently this is one of the limitations of the Compaq IDE interface (and probably the reason why they don’t support IDE drives).

Next bright idea: add an internal PCI IDE card. A cheap ITE 8212 non-RAID card was duly ordered and installed, and the card’s onboard BIOS had no problem detecting all 160 GB of the drive — great! “Non-system disk or disk error” — not so great!

I couldn’t find any way to tell the IDE card’s BIOS (or the Compaq System Configuration Utility) to avoid trying to boot off the attached IDE drive. There is a boot priority order, but this ignores plug-in IDE cards and only pays attention to the onboard SCSI and IDE controllers. Upgrading the Compaq system Flash to the latest supported version didn’t help matters.

Next step: disable the IDE controller card entirely in the SCU and let Windows 2000’s Plug & Play mechanism configure it instead. For a while, it looked like this would work: Windows correctly detected the card and installed the latest drivers, which I had downloaded from ITE’s website. However, it point-blank refused to find any drives attached to the card. (I knew the drive was there since the BIOS had correctly spotted it earlier.)

After much googling, and reading discussion threads from others who had tried and failed to get an IDE drive going in a ProLiant, I was almost ready to give up. However, there was one thing left to try: a different IDE controller card.

After a little research, I came across the MRI RAID IDE Controller, which is based on a Silicon Image/CMD SiI0680 chipset rather than ITE. After it arrived, I swapped out the ITE card for the new one, disabled the new card in the SCU, and once again let Windows Plug & Play figure it out.

And this time … it worked! CMD’s IDE drivers must be a bit more robust/adaptable than the ITE drivers, since it had no trouble at all detecting the hard disk. Better yet, it recognised the full 160 GB of drive space.

Compaq have always made PCs that were almost but not quite completely compatible with normal PCs. While the ProLiant series have many fine attributes, incidents such as this help explain why I will never, ever, buy a Compaq server myself.

SysInternals Suite

Good news! All the free utilities from SysInternals.com are now available as a single convenient download here:

    Download SysInternals Suite [7.2 MB]

As you may or may not know, SysInternals was a website run by Mark Russinovich and Bryce Cogswell. Founded in 1996, it was the place to go for useful Windows utilities that did “hard” things – registry snooping, file monitoring, rootkit detectors, etc.

For most IT professionals, utilities like Process Explorer, RegMon and FileMon have long been indispensable parts of their computer toolkit. (I have a particular interest in the latter, since it performs a similar function to my own SnoopDos utility from 1989).

A few months ago, SysInternals was bought by Microsoft; Mark and Bryce are now working for Bill. One of the immediate changes was that all the utilities are now hosted on Microsoft’s website. A less visible change is that the tools no longer come with source code. This is a huge blow to those of us who use the SysInternals tools as reference examples for a wide variety of programming techniques – for example, how to create virtual device drivers that can be installed without a reboot (greatly simplifying the installation process). Luckily, you can still find copies of the original source archives if you know where to look, though I expect that won’t last long.

The SysInternals tools are also exemplars of efficient coding, with executable sizes typically in the 100-500K range. In a world of ever-more-bloated programs, it’s nice to know that there are still people out there who care about such things.

It remains to be seen whether the ethos behind the SysInternals tools will change significantly as a result of the Microsoft takeover. I hope not, but the first worrying signs are already apparent. The removal of Linux versions of some tools is also a shame (though not a surprise).

In the meantime though, I salute Mark and Bryce for 10 years of supreme contribution to the Windows community. If you haven’t already downloaded the SysInternals Suite, do it now!

Some cool CES things

There were lots of impressive products at CES 2007, most of which have been covered in detail elsewhere.

Here are some that impressed me but haven’t received much coverage:

LG’s 3D Television

Buried deep within LG’s massive stand was a 42″ high-definition LCD screen displaying genuine 3D video. This has apparently been around for a few months, but it was the first I’d heard of it.

The effect is stunning – proper 3D with no special glasses required. As with traditional 3D displays (such as IMAX and motion simulator rides), it takes your eyes a second or two to adjust – then everything jumps into focus.

Of course, there are some limitations. The optimal viewing range is 3 m to 7 m from the display, with a maximum viewing angle of around 30 degrees. All the video on display was computer generated; I’d have liked to see some live video as well.

The screen works by integrating 25 separate LCD panels, each of which produces a display with a very narrow viewing angle. The 25 views are positioned around the objects being viewed, so that standing in any position, your eyes will see only two views at a time (one for each eye). The panel is thicker than a normal flat panel, but not overly so – about 6-8″ deep.
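To make the geometry a little more concrete, here is a toy model of my own (not LG’s actual design, and the 30-degree cone is an assumption) showing how two eyes a fixed distance apart naturally fall into adjacent viewing zones and so receive a stereo pair:

    # Toy multi-view model: 25 views spread across an assumed 30-degree cone.
    import math

    NUM_VIEWS = 25
    CONE_DEG = 30.0
    ZONE_DEG = CONE_DEG / NUM_VIEWS          # 1.2 degrees per view

    def view_for_angle(angle_deg):
        """Map an eye's angle within the cone to one of the 25 views."""
        return max(0, min(NUM_VIEWS - 1, int(angle_deg // ZONE_DEG)))

    # Eyes ~6.5 cm apart, viewed from 3 metres, are a little over a degree apart...
    eye_sep_deg = math.degrees(math.atan2(0.065, 3.0))   # ~1.24 degrees

    # ...so the left and right eyes land in neighbouring zones.
    print(view_for_angle(14.0), view_for_angle(14.0 + eye_sep_deg))   # 11 12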

Capturing live video to work with this system will require some advanced cameras, since 25 images would need to be recorded simultaneously, each from a slightly different perspective.

Regardless, this is deeply impressive technology – some day, no doubt, all TVs will look like this.

Smyth Research’s Surround Sound Headphones

If 3D television wasn’t enough, we stumbled upon 3D sound in the form of Smyth Research’s new virtual surround sound headphones. These aim to allow headphone users to experience a surround sound movie or album in all its glory, without alienating their neighbours.

While most DVD players and A/V receivers offer some form of downmixing to let you listen to surround sound movies on headphones, this new system is in a completely different class. A sensor mounted on the headphones identifies the position and orientation of the headphones in space, and the sound is adjusted accordingly.

When you turn your head to the left or right, audio coming from each speaker appears to stay at the same location within the room, rather than moving with your head as normally happens with headphones. The effect is amazing – the first time I heard it, I immediately took the headphones off to confirm that the soundtrack hadn’t simply been routed back to the main speakers again.
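A crude way to picture what the processor must be doing (this is my own illustration, not Smyth Research’s algorithm): the virtual speakers are fixed in room coordinates, so each source’s azimuth relative to the head is shifted by the opposite of the head’s rotation before the binaural rendering stage.

    # Hypothetical 5.0 speaker layout, azimuths in degrees, room coordinates.
    SPEAKERS = {"front_left": -30, "centre": 0, "front_right": 30,
                "surround_left": -110, "surround_right": 110}

    def azimuths_relative_to_head(head_yaw_deg):
        """Return each speaker's apparent direction as the listener hears it."""
        rel = {}
        for name, az in SPEAKERS.items():
            rel[name] = (az - head_yaw_deg + 180) % 360 - 180   # wrap to [-180, 180)
        return rel

    # Turn the head 40 degrees to the right: the centre channel now appears
    # 40 degrees to the listener's left, i.e. still where the TV is.
    print(azimuths_relative_to_head(40.0)["centre"])   # -40.0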

The system we saw used an infra-red transmitter positioned above the TV to send out a reference signal which a receiver on the headphones picked up, analysed, and then transmitted back to the sound processor via the audio cable. This approach allows several headphones to be used at once in a single room.

With the infra-red solution, listeners must not turn their head more than 60 degrees away from the TV, or the effect is lost. This can be overcome using an alternative RF positioning system.

Smyth Research don’t sell hardware products themselves; instead, they are licensing the technology to manufacturers for inclusion in their audio equipment. The demo we heard was using Yamaha equipment; expect products to be on the market by the end of 2007.

As an aside, Smyth Research are based in Bangor, Northern Ireland – it was nice to see some Irish representation at the show. They were previously involved in the development of the DTS Surround Sound system, so their audio pedigree is well established.

The Laser Mouse That Works on Glass

A4Tech had a stand in the international section over at the Hilton, where they were showing their laser mouse that works on glass.

Most optical mice fail dismally when moving over a shiny or glass surface; not the A4Tech mouse. I spent some time playing with their demo setup, and can confirm that it worked just as well on glass as on other surfaces. A simple trick, no doubt, but extremely useful.

Apparently the mouse is already on sale in Korea – hopefully it will make it to Europe soon.

CNET’s Next Big Thing

Alongside the main CES convention, there were numerous seminars and panel sessions to discuss issues of interest to the consumer electronics industry. Most of these cost money to attend, but some were open to all CES participants.

The first session we attended (on Monday afternoon) was CNET’s Next Big Thing, a discussion about three hot areas of debate in the industry at the moment:

  • Whether there is a genuine market for displaying video on mobile phones and other small-screen devices
  • Whether streamed online content will lead to the death of the DVD
  • How Digital Rights Management for movies & music can evolve into something less customer-hostile.

Several industry panelists (including a CNET reader flown in from the UK to represent the consumer) discussed these items at length; the audience then voted electronically on how they thought things would develop in the future.

CNET's Next Big Thing was a popular session

Mobile video

On mobile video, the verdict was: yes, there is a new and growing market for media playback on mobile devices, but access needs to become far less restrictive (both in price, and range of media available) before it will take off. Mobile video will supplement existing markets rather than replacing them.

Dynamic content like news clips, music videos, and short TV programs is better suited to these devices than feature-length movies. One panelist predicted that the production style of programs will change to match the constraints of the mobile devices – more talking heads, fewer panoramic landscapes, etc.

All reasonable points. However, the most important point was only briefly mentioned: mobile video need not be streamed. People use it to fill time when they would otherwise be bored (standing in a queue, waiting for something to complete), and it suffices to have pre-loaded content available on their mobile device; it’s not a big deal if it’s several days old.

With modern video codecs and ever-increasing flash capacities, it’s quite feasible to store multiple movies or (more likely) TV programs on your portable device. I think this trend is just beginning to gather steam.
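The numbers back this up. A rough calculation, where the bitrate and capacity figures are my own assumptions rather than anything quoted at the session:

    # Back-of-envelope storage estimate for pre-loaded mobile video.
    bitrate_kbps = 500              # a modest small-screen H.264 encode
    minutes = 45                    # one TV episode without adverts
    episode_mb = bitrate_kbps * 60 * minutes / 8 / 1000.0
    print(episode_mb)               # ~169 MB per episode

    flash_gb = 4                    # a typical flash-based player
    print(int(flash_gb * 1000 / episode_mb))   # ~23 episodes per device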

The death of DVD?

The panel were (eventually) united in their view that DVD wouldn’t be going away anytime soon – certainly not by 2010, as had been suggested. The tangible nature of a DVD (collectable, suitable for gifts, easy to use and distribute) makes it unlikely to disappear.

However, the panel did agree that online streaming media will gain a significant following as well. (No surprises there.)

I’m broadly in agreement with this view. It takes a long time for a given technology to die out; after all, there are still plenty of VHS tapes around, even though it is now difficult to buy a VHS recorder.

I also doubt we’ll see broad acceptance of streamed video until most households have a fast broadband connection (where ‘fast’ means 10 Mbps or higher, with no contention).

Digital Rights Management

Digital Rights Management (DRM) was the most contentious discussion, with representation from the legal profession, the movie industry, and consumers, as well as several industry observers. Everyone agreed that some sort of rights management was essential, but making it transparent enough that consumers are not unduly restricted when using media they believe they own was judged to be a difficult problem.

Apple’s iTunes was given as an example of how to make DRM straightforward, so that customers are not always even aware that any special restrictions apply.

The biggest hindrance to existing DRM technology is that it goes against the grain of what consumers are used to. If you buy a CD, you can take it to a friend’s house to share with them or loan to them. You can play the same CD in your living room, your bedroom, your car, on your laptop, and anywhere else that has a CD player.

This is almost never true for media protected with DRM, especially where that media has been downloaded rather than delivered in some sort of physical packaging. While providers are entitled to protect their content, they need to find a less onerous way to do so than has currently been attempted.

I have no doubt that this problem will be solved; until then though, consumers will continue to be highly suspicious of anything resembling DRM.

How big is that hard disk?

Seagate had a cute visual exhibit at CES 2007 – a typical laptop hard disk, represented as an equivalent number of CDs (assuming you only store MP3s, of course.)

Lost: 1 Laptop. Also lost: 52,716 MP3s.
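Working backwards from Seagate’s figure gives a feel for the drive size involved (the average file size here is my own assumption, not Seagate’s):

    # How much music is 52,716 MP3s?
    mp3_count = 52716
    avg_mp3_mb = 3.0                      # a 3-4 minute track at 128 kbps
    print(mp3_count * avg_mp3_mb / 1000)  # ~158 GB, i.e. a 160 GB laptop drive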

Seagate’s point, of course, is that you might want to think about backing up that hard disk … and naturally, they had a product designed to do exactly that.

CES 2007

I’ve just returned from the Consumer Electronics Show, which ran from January 8-11 in Las Vegas. This was my first time attending the largest trade show in America (2700 exhibitors, 150,000 attendees) and it was quite an experience. Over the next few days, I’ll post entries about some of the more interesting products and companies I saw.

I was travelling with some colleagues, and we had four days to cover the exhibition – just as well, since it covers ten large convention halls, and conference space in several nearby hotels.

The scale of CES is hard to grasp – by lunchtime on the first day, we thought we had cleared half of one hall and felt we were doing well, until we realised it was only half of one corner of one hall!

It was time for a more aggressive approach. By planning ahead, and stepping up the pace substantially, we managed to cover about 90% of the exhibits during the first three days! This left Thursday morning for some final follow-up visits to stands, and we were all done by lunchtime, leaving the afternoon for some R&R.

The conference was well organised, with courtesy coaches between the different venues, free Internet access available in the halls (supplied by DivX), and even free international phone calls courtesy of Vonage, who had installed VoIP kiosks in the lobbies. Complimentary magazines from just about every consumer and electronics publisher you’ve ever heard of were also available outside the conference entrance, along with the six show guides and product directories.

While most of the halls were in the gargantuan Las Vegas Convention Center, one hall was in the Sands Expo Centre, about a mile away. (The Adult Entertainment Expo was taking place in an adjacent hall during the week, which led to an interesting mix of attendees in the areas outside the hall.)

General Trends

Two products were in evidence just about everywhere this year – large high-definition LCD televisions, and digital photo frames. Even manufacturers who didn’t sell photo frames were using them to display information sheets for their actual products.

There was a wide and varied selection of Windows Media Centre PC cases from assorted vendors. Perhaps the most impressive (if not the most practical) was the Diamond Media Centre range from Moneual, which featured a jewel-encrusted fascia panel and control knobs. At prices ranging from $30,000 to $1,000,000, these are for elite purchasers only.

Moneual's diamond-encrusted Media Centre PC

While most of the larger vendors had extravagant stands with large floor areas, there were plenty of smaller stands there as well. The entry-level cost for a small stand (about 2 metres by 3 metres) is in the region of $6,000 to $8,000 depending on location, which is within reach of even small startups.

There were surprisingly few freebies on offer. Other than the ubiquitous carrier bags, and a few trinkets like LCD pocket torches, watch batteries, and T-shirts, pickings were slim.

Apple steal some thunder

One of the biggest announcements didn’t even happen at the show. Apple were holding their Macworld keynote in San Francisco during the same week, and launched their new and long-awaited iPhone, though it won’t be available until June. While the industry was impressed with the product, there was plenty of scepticism about Apple’s ability to break into the highly competitive cellphone market, especially with a product that costs $600! Time will tell…

Virus Creation in the Lab

The US magazine Consumer Reports (similar to Which? magazine in the UK) has been in the technology news recently. As part of a comprehensive test of antivirus software packages, they commissioned a consulting company to create 5,500 new viruses to see how well market-leading programs would cope.

The antivirus industry, led by McAfee, was immediately up in arms when they heard about it. Imagine the risk to society of these viruses escaping into the wild! What blatant disregard for consumer safety! And other similar scaremongering…

It only takes a little scratching below the surface to show that their concerns are, at best, misguided. The viruses created for Consumer Reports were simple modifications of existing viruses, altered so that their signature was no longer identifiable. The viruses were kept in a secure environment, and all copies were removed after testing – only a single CD remains, which is kept in a locked and secure cabinet on site.

Surprise, surprise – McAfee’s package didn’t do particularly well in the test; it relies heavily on a signature database to identify new threats. When viruses were still something of a novelty, this approach worked well – it often took weeks before a new virus gained notoriety, giving McAfee plenty of time to respond.

By now, however, it is so easy for would-be virus writers to develop new viruses, and variants on existing viruses, that a pure signature-based approach is no longer sufficient. A more pro-active approach is needed, one that can identify virus-like behaviour and quarantine or block the affected program. Of course, there will be legitimate tools which end up looking like a virus – commercial tools can be recognised and permitted explicitly, while a mechanism can be included to allow users to grant access to other programs on an as-needed basis.
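To make the distinction concrete, here is a toy contrast between the two approaches. It is purely illustrative (my own sketch, with made-up hashes and behaviour names) and bears no resemblance to how a real antivirus engine is built:

    # Signature matching: a trivially modified virus gets a new hash and slips through.
    KNOWN_SIGNATURES = {"hash-of-known-virus-1", "hash-of-known-virus-2"}

    def signature_scan(file_hash):
        return file_hash in KNOWN_SIGNATURES

    # Behavioural check: flag virus-like actions unless the program has been
    # explicitly permitted by the vendor's list or by the user.
    SUSPICIOUS = {"writes_to_other_executables", "adds_autostart_entry",
                  "disables_antivirus"}

    def behaviour_scan(observed_actions, user_approved=False):
        if user_approved:
            return False                 # the user granted access explicitly
        return len(SUSPICIOUS & set(observed_actions)) >= 2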

Maybe the industry should use two distinct terms – “Virus removal”, for packages that can remove existing viruses which are already known to the program, and “Antivirus” for packages that can detect new virus strains and prevent infection in the first place. (Somehow, though, I can’t imagine vendors thinking this is a good idea.)

Whenever Which? reviews product categories that I know well, I find myself disagreeing with their conclusions; this doesn’t give me much confidence in their reviews of other products that I’m not familiar with. People I trust have made similar comments about Consumer Reports. In this case, however, they’re on the side of right. More power to them…

(In case you’re wondering, the top rated antivirus packages were from BitDefender and ZoneLabs. The full report is only available to subscribers.)