IT-DISCUSS Archives

January 2006

IT-DISCUSS@LIST.UVM.EDU



Subject: 'Virtualizing' and other trends to come in IT world
From: Steve Cavrak <[log in to unmask]>
Reply To: Technology Discussion at UVM <[log in to unmask]>
Date: Fri, 20 Jan 2006 08:33:01 -0500
Content-Type: text/plain
Parts/Attachments: text/plain (196 lines)
'Virtualizing' and other trends to come in IT world
By: Dan McLean
IT World Canada  (18 Jan 2006)
As published in The Globe and Mail
http://www.globetechnology.com/servlet/story/RTGAM.20051222.gttweinsider22/BNStory/Technology/

'Tis the season for prognostications and a time to consider what's likely to be important in the world of information technology in 2006.

Such predictions aren't simple, given the overblown hype around everything IT. Listen to the constantly churning gristmill of marketers and you'd swear there isn't a lousy or irrelevant technology anywhere. But which IT products are truly meaningful, particularly for small businesses?

Looking back on a year's worth of IT trends gives some indication of what might matter in the 12 months ahead. Admittedly it's no more than a guessing game, since the majority of big IT ideas ultimately wind up losers, long forgotten. Think of the Larry Ellison "thin client," the Apple Newton, Windows Millennium Edition, switched token-ring technology and Bob (the graphical user interface add-on for Windows 3.1).

There are more than enough turkeys for this holiday season, but let's consider which technologies and IT trends are likely to matter most -- at least within the time frame of the coming year. Here are five possibilities.

* Dual-core multiprocessing: Two or more processor cores on a single chip are definitely better than one when it comes to computing. That's what dual-core multiprocessing is about, and it's finally here for desktop computers. The unofficial unveiling of dual-core for the mainstream happened in 2005, and the coming year should see it take off in servers and workstations aimed at smaller businesses.

Dual-core is now fully embraced by chip giants Intel Corp. and Advanced Micro Devices Inc. By the end of 2006 it will be the microprocessing standard. That's confirmed by Intel, which reports that "70 [per cent] to 85 per cent of all new desktop, mobile and server processor shipments will be dual-core by the end of 2006."

Dual-core brings computing power like never before -- true multitasking, ideal for jobs such as real-time information searches that run in the background while other processes and operations continue to chug away. Dual-core processing also sets the stage in 2006 to bring enterprise-scale, computationally complex applications -- customer relationship management, decision support, and graphics creation and editing, among other things -- to smaller businesses, at a price similar to that of single-core chips.
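To make the multitasking point concrete, here is a minimal Python sketch -- an illustration of this editor's own, not something from the article -- in which a CPU-bound search runs in a separate process, free to occupy the second core, while the main program carries on with other work:

    # Toy illustration of dual-core multitasking: a CPU-bound "search"
    # runs in its own process (which a second core can pick up) while
    # the main process keeps working. All names are hypothetical.
    from multiprocessing import Process, Queue

    def background_search(records, needle, results):
        """Scan the records on a separate core; report matches back."""
        results.put([i for i, r in enumerate(records) if needle in r])

    if __name__ == "__main__":
        records = ["record-%d" % i for i in range(500000)]
        results = Queue()
        searcher = Process(target=background_search,
                           args=(records, "42", results))
        searcher.start()                    # search begins in the background

        busy_work = sum(i * i for i in range(2000000))  # other work chugs away

        print("matches found:", len(results.get()))
        searcher.join()
        print("other work result:", busy_work)

On a single-core machine the two jobs would take turns; on a dual-core machine they genuinely run at once, which is the whole pitch.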

* 64-bit computing: The next generation of computing in small business comes in the form of dual-core and 64-bit processors like the AMD Opteron and Intel's Itanium and Xeon chips. These hunks of silicon feature a king-sized memory space (a theoretical 16 billion gigabytes versus the 4 GB limit of 32-bit systems), eliminating the archaic need for continual data swapping from slow hard drives. Far more data can be held in memory on a 64-bit system, making data searches and retrieval much quicker -- important when working with an extremely large database or other massive files.
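The "king-sized" figure is straightforward exponent arithmetic; here is the calculation behind the numbers quoted above, with Python standing in as a calculator:

    # How much memory can each pointer width address?
    GB = 2**30                    # one (binary) gigabyte, in bytes

    print(2**32 // GB, "GB")      # 32-bit: 4 GB, the familiar ceiling
    print(2**64 // GB, "GB")      # 64-bit: 17,179,869,184 GB --
                                  # the "16 billion gigabytes"
                                  # (16 exabytes) cited above, in round numbers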

With the anticipated general release of Microsoft Corp.'s Vista operating system, planned for late 2006, 64-bit computing will get the push it needs to make its mark. A plethora of 64-bit applications will surely follow.

* WiMax: As the communications world sheds the shackles of wireline, the time is now for a wireless networking technology like WiMax. There's the promise and potential of fantastic speeds and range -- 75 megabits per second (Mbps) over 50-kilometre spans under optimal conditions.

In 2006, the technology will break ground as a fixed-point wireless link, delivering something closer to 2 Mbps and building from there. The first commercial products incorporating the WiMax 802.16 specification reached the market this year, and communication services are coming. Earlier this month, an organization called the Alberta Special Areas Board and Nortel Networks Corp. revealed a plan to roll out in 2006 a network offering WiMax-based services across 21,000 square kilometres of rural southeastern Alberta. Expect similar initiatives to follow as WiMax makes a breakthrough in broadband wireless for rural areas. The country's massive real estate makes Canada a natural proving ground for a high-speed wireless technology like WiMax.
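To put those rates in perspective, a quick back-of-the-envelope calculation (the 100 MB file size is an arbitrary example, and protocol overhead is ignored):

    # Time to move a 100 MB file at the two rates mentioned above,
    # assuming the link runs at its full nominal rate.
    def transfer_seconds(size_megabytes, rate_mbps):
        return size_megabytes * 8 / rate_mbps   # megabytes -> megabits

    print(transfer_seconds(100, 2))    # early fixed-point WiMax: 400 seconds
    print(transfer_seconds(100, 75))   # optimal-conditions peak: ~10.7 seconds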

* Virtualization: Now you see it, but you really don't. That's virtualization for you. It's scattered computing power and application resources brought to bear as a single high-performance resource. It's IT not necessarily sitting in one place, but scattered everywhere and tied together by software that links processing and application functions into a managed collective -- a big engine made of smaller separate pieces. "Virtualizing" of everything -- from computer processing to data storage, from network communication systems to distributed application functions -- is what will matter in 2006. It's already visible in concepts such as grid computing, and although most small businesses can't yet see or apply the value of that concept, the trend toward such utility computing is clear.
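In miniature, the idea looks something like the following hypothetical Python sketch, where a pool of worker processes stands in for the "scattered" machines and the pool object is the single resource the software presents:

    # Scattered compute tied together by software into one managed
    # collective: a process pool plays the role of separate machines.
    from concurrent.futures import ProcessPoolExecutor

    def crunch(chunk):
        """Stand-in for a unit of work farmed out to one 'node'."""
        return sum(x * x for x in chunk)

    if __name__ == "__main__":
        chunks = [range(i * 100000, (i + 1) * 100000) for i in range(8)]
        with ProcessPoolExecutor() as pool:   # the "single resource"
            partials = list(pool.map(crunch, chunks))  # pieces run anywhere
        print("combined result:", sum(partials))

The caller sees one engine and one answer; the software decides where the smaller separate pieces actually run.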

Virtualization recognizes there's a lot of untapped and unused computing potential out there already, and companies aren't interested in buying more. Instead, it's a matter of making better use of what they've already got. That's what business wants, and in 2006 it's what they'll get -- through virtualization.

* The IT Utility: On a related note, Nicholas Carr, the controversial former Harvard Business Review editor who two years ago wrote that "IT doesn't matter," this year asserted that it's the "end of corporate computing." The time, he says, is now for the emergence of computing delivered as utility-type services -- capabilities that customers pay for based on usage.
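At its simplest, paying based on usage reduces to metering consumption and billing per unit. A hypothetical sketch -- the resources and rates here are invented, not taken from any vendor's price list:

    # Usage-based billing in miniature: meter what was consumed,
    # multiply by a per-unit rate. Rates and usage are made up.
    RATES = {"cpu_hours": 0.50, "gb_stored": 0.10}   # dollars per unit

    usage = {"cpu_hours": 120, "gb_stored": 300}     # one customer's month

    bill = sum(RATES[item] * amount for item, amount in usage.items())
    print("monthly bill: $%.2f" % bill)              # $90.00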

I think Mr. Carr may be right. Utility-type computing services are popping up in lots of places and, although the world is a long way from plugging into a ubiquitous computing power grid, little IT power plants are springing up all over. Companies like International Business Machines Corp. and Sun Microsystems Inc. have been among those championing the concept. Just this month, Hewlett-Packard Co. joined the crowd, announcing infrastructure and application-provisioning services for businesses that need servers and applications to handle temporary surges in computing demand. Expect more IT vendors to follow and more services to appear in 2006.

Concepts like virtualization and computer-processing and storage-area-network "grids" are among the most obvious examples of how IT gets built on utility-like models. What makes buy-versus-build particularly compelling for smaller businesses is simply that many cannot keep their IT up to snuff. The need for higher availability and performance in computing is pushing the IT infrastructures of many smaller businesses beyond their limits.

It's why many would rather not do IT themselves. In 2006, they may not have to.
