With the Intel Developer Forum behind it, Intel reported third-quarter revenue of $9.4bn, beating analysts' expectations. The company reported operating income of $2.6bn, net income of $1.9bn and earnings per share of 33 cents.

"As we look ahead, Intel's game-changing 32nm process technology will usher in another wave of innovation from new, powerful Intel Xeon server platforms to high-performance Intel Core processors to low-power Intel Atom processors," says Paul Otellini, Intel president and CEO. Computer Weekly asked other suppliers to gaze into their crystal balls.

GPU v CPU

For so long, the CPU has been king, powering everything from simple bat-and-ball games on early PCs through to crunching huge quantities of database records. Though the death knell was sounded for the von Neumann model as long ago as the 1970s, it has largely persisted, to the extent that manufacturers stitch CPU cores together to make them look like a single central processing facility.

Nevertheless, graphics has proved very greedy for CPU cycles, and separate GPUs (graphics processing units) have become commonplace, initially as slot-in cards and more recently bundled on most motherboards, although basic integrated graphics struggle to cope with the visual requirements of modern operating systems. However, in the first quarter of 2010, Westmere, Intel's collective term for its new 32nm processors, will integrate the graphics core previously implemented on a separate chipset into the same CPU package. A year on, the graphics will move onto the same die.

The purpose of a GPU is to push out as many pixels (for HD television) or polygons (for games) as possible. "A GPU is suited to massively parallel processing tasks. The more things you can throw at a graphics chip to do at once, the happier it is," says Benjamin Berraondo, product PR manager at GPU supplier nVidia. "Over five years ago, Moore's Law started to hit a wall when ever-faster CPUs were starting to overheat, so first dual-core processors were developed, then quad-core.

"GPUs were going multicore since their inception, and 240-core GPUs are currently available using 256-bit cores to provide extremely high memory throughput. A GPU can be used for more than just graphics, though, and can be programmed for any parallel processing tasks like financial modelling, science and engineering."

For instance, BNP Paribas Corporate and Investment Banking (CIB) says it has boosted the capabilities of a supercomputer that simulates the behaviour of financial markets for the bank's Global Equities and Commodity Derivatives (GECD) group by a factor of 15, while cutting the energy consumed per calculation by a factor of nearly 190.

About one teraflop (one thousand billion calculations per second) of workload has been transferred to a GPU-based platform, providing a 100-fold increase in the amount of calculation achieved per watt. The new platform is based on two nVidia Tesla S1070 units consuming 2kW and will replace more than 500 traditional CPU cores consuming 25kW.
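
The headline numbers hang together as simple arithmetic, assuming the factor of nearly 190 refers to energy used per unit of calculation:

    # Back-of-envelope check of the BNP Paribas figures quoted above
    # (assumption: the ~190 factor is energy per unit of calculation).
    speedup      = 15     # reported capability boost
    old_power_kw = 25     # more than 500 traditional CPU cores
    new_power_kw = 2      # two nVidia Tesla S1070 units

    power_ratio = old_power_kw / new_power_kw     # 12.5x less power drawn
    energy_gain = speedup * power_ratio           # 15 * 12.5 = 187.5
    print(energy_gain)                            # roughly the 190 quoted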

"We are extremely pleased with this performance, which significantly exceeds our initial expectations. We hope to transfer more calculations to the GPU architecture in the near future" said Stéphane Tyc, Head of GECD quantitative research, BNP Paribas CIB.

Circumventing system crashes

Among IT departments and end-users, little has done more damage to Microsoft's reputation than the "blue screen of death". In the worst instances, the only option is to reformat the disc and reinstall Windows. Having their hardware incapacitated by software has long infuriated users. Now it looks as if a solution has been found, not by Microsoft in software, but by Intel in hardware.

Much as a BIOS gives a user access to a dead PC's hardware, Intel's vPro technology allows IT departments to reinstall Windows, or to install new patches, over an IP network without having to send engineers out on the road. It provides OS-absent manageability and down-the-wire security, even when the desktop or notebook is switched off, the OS is unresponsive, or software agents are disabled.

Steve Shakespeare, Intel director of EMEA enterprise solutions, says, "Imagine vPro as a computer inside a computer. As long as the machine has power and a network connection, it can be fixed remotely from a management console within the IT department. You can re-flash the BIOS, put new firmware into it, and then restart it.

"vPro has already been introduced in notebooks and will also be included in the Core i7 and i5 processors which will launch early next year. They will use the same 45nm technology currently deployed in the server space."

London Underground operator Tube Lines' desktop and notebook PC infrastructure was reaching the end of its life, with most of the hardware almost six years old. Together with Intel and its IT services provider Capgemini, Tube Lines found that HP hardware powered by the Intel Core 2 processor with vPro technology delivered the best overall performance.

Adrian Davey, head of IT at Tube Lines, says, "By working in close partnership with Intel and Capgemini we have received a phenomenal level of service. The dynamic desktop management enabled by Intel vPro technology is having an extremely positive impact on both our corporate green philosophy and our bottom line."

SSD v Braidwood

A report by Objective Analysis found that Intel's Braidwood technology, a NAND flash module that sits on the motherboard to accelerate disc access, offers performance and power-consumption improvements comparable to solid state drives (SSDs), but at considerably lower cost.

However, Intel's Shakespeare was strangely dismissive of Braidwood, saying Intel had no new announcements on it, nor on when it would be productised. "If you have a vPro-based machine and are looking for accelerated disc support, SSD can help with that. I have a Centrino-based machine with an SSD hard drive replacement, and I am delighted with it."

The London Borough of Hillingdon has been using a Compellent storage area network (San) for the past six months, but recently updated it to include two SSDs. "We were looking for a new storage solution that could automatically manage data and drive down the cost per terabyte of stored data over the life of the San. It had to provide us with affordable system resilience and also contribute to a greener IT infrastructure," says Roger Bearpark, assistant head of ICT at Hillingdon.

Hillingdon faced a problem common among many organisations. Although the borough was storing terabytes of data, only 10% could be considered mission-critical at any time. However, the other 90% could not be moved to offline storage as it could be required at short notice to form the basis of national government reports or to help solve problems with legacy projects.

The previous San could not automatically distinguish between primary and secondary data. This meant Hillingdon's IT team had to either spend time manually moving data or leave secondary data on expensive, high-speed discs.
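
Automated tiering of the kind Hillingdon bought boils down to a simple policy: data that has not been touched recently migrates off the expensive tiers. The hypothetical sketch below illustrates such a policy; the tier names and age thresholds are invented assumptions, not Compellent's actual algorithm.

    # Hypothetical tiering policy: place data by how recently it was used.
    # Thresholds and tier names are invented for illustration.
    import time

    TIERS = [           # (tier, maximum age of last access in days)
        ("ssd", 7),     # mission-critical, hot data
        ("fc", 30),     # recently active data on Fibre Channel disc
        ("sata", None), # everything else on cheap, high-capacity disc
    ]

    def choose_tier(last_access_epoch, now=None):
        age_days = ((now or time.time()) - last_access_epoch) / 86400
        for tier, max_age in TIERS:
            if max_age is None or age_days <= max_age:
                return tier

    print(choose_tier(time.time()))                # "ssd": touched just now
    print(choose_tier(time.time() - 90 * 86400))   # "sata": idle for 90 days
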
Bearpark says, "We have no directly attributable saving, as we introduced the initial SSD as an additional tier to supplement FC [Fibre Channel] and SATA. What we will do is calculate the potential environmental/financial benefit of running SSD as a replacement for some of the FC disc and then look at further deployment in FY 2010/11.

"I am sure that certainly in portable and in some desktop devices Braidwood will be more attractive than SSD but in the storage arena it is not in the same league."

Gigabit to the desktop

In early October, following the collapse of pay-TV operator Setanta, the first England football match to be shown live over the internet rather than on traditional broadcast television was streamed to fans. Traffic of that kind needs prioritising in the network, and bundling IP-based quality of service (QoS) functionality within gigabit switches is becoming increasingly common now that proprietary QoS implementations from the likes of Microsoft and Cisco have been superseded by a single converged QoS standard.

"The QoS standard is now ratified by the Internet Engineering Task Force," said Nigel Moulton, vice-president of business solutions at D-Link Europe. "QoS is now a function that you can turn on in the network to effectively prioritise certain traffic types over others. QoS will typically be implemented in an application-specific integrated circuit and configured in software by the user.

"For example, QoS is a requirement with voice traffic if you want low latency in order to eliminate jitter. If you want multi-way audio conferencing, QoS may be a requirement to guarantee voice quality, as well as video multicasting, where there is a requirement to split the audio stream from the video stream, and make sure that it all arrives at the destination at the right time.

"You have to be able to reassemble the audio codec, so QoS in the network gets over this problem by and large. Multicasting requires replication of a video stream, and that can be the responsibility of the network provider or the integrator who has designed the network for the customer."