Towards the 1TB (Terabyte) disk drive

Wow … the doubling continues. There are two key points that I
like about this article. The first is that 500GB PC disk drives
will be on the market this year. That is now 100,000 times the
capacity of the first hard disk drive I ever owned! That 100,000-times growth in capacity has occurred in less than 30 years.
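Just to sanity-check my own math (a rough back-of-the-envelope calculation, assuming the 100,000x figure and a span of roughly 30 years), that growth works out to a doubling about every couple of years:

# number of doublings in a 100,000x increase: log2(100000), roughly 16.6
echo "l(100000)/l(2)" | bc -l

# years per doubling over roughly 30 years: 30 / 16.6, roughly 1.8
echo "30 / 16.6" | bc -l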

The second key point is that it shows no sign of stopping. From this article:

Desktop drive capacity will top out at around 1 terabyte by late 2006,
before running into technological problems in maintaining data
stability.

We are on track to double again by the end of next year! It is
difficult to imagine that common, and eventually commodity, hard disk
drives will reach these sizes. That is a huge amount of
data. In addition, we humans will simply solve the tough
problems and continue these growth rates with newer technologies.

A last point is that there are currently numerous solutions – the most
popular referred to as RAID – that allow you to aggregate multiple disk
drives into a redundant array that appears as one, even larger, disk
drive. I have been following solutions like this over the years as a
way to keep a mental benchmark of the cost of a terabyte of storage. I am now
stunned that in the next year, I might be buying one terabyte of hard
disk storage in a single PC disk drive.
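As a rough illustration of the RAID idea on Linux (just a sketch using the mdadm tool with made-up device names and mount point, not my actual setup), a few drives can be combined into one array that the system sees as a single large disk:

# create a RAID-5 array from three disks; it appears as the single device /dev/md0
mdadm --create /dev/md0 --level=raid5 --raid-devices=3 /dev/sda1 /dev/sdb1 /dev/sdc1

# put a filesystem on the array and mount it like any other disk
mkfs.ext3 /dev/md0
mkdir -p /mnt/bigdisk
mount /dev/md0 /mnt/bigdisk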

PC World: PC Drive Reaches 500GB.
Demand for greater capacity continues to rise due in large part to a
growing need for music and video storage on PCs and consumer
electronics devices. To meet that need, storage vendors are turning to
new recording technologies. The first of these, perpendicular
recording, will debut from Toshiba this year. [Tomalak’s Realm]

Up2date e-mail notifications

A while back I had started to experiment with a way to get e-mail
notifications from my servers when up2date detected that new packages
were available. I am running a series of Fedora Core 1, 2, and 3
boxes and it seems that the updates come quite frequently.

I decided that this weekend I would sit down and write a new bash
script that could be run daily by cron. Here’s what I wrote:

#!/bin/bash
# First let's check with up2date ...

# have up2date list the available packages ...
up2dateOutput=`up2date --nox -l`

# now check to see if packages are listed ...
# take the output, grep for the long hyphen divider
# (grabbing that line and the next line), then awk the second
# line to see if there is a package name at the beginning
firstUpdatePackage=`echo "$up2dateOutput" | grep -A1 "\-\-\-\-\-\-\-\-\-\-" | awk '{if (NR==2) print $1}'`

#echo "First package: |$firstUpdatePackage|"

if [ ! -z "$firstUpdatePackage" ]
# there is a package name
then
  #echo "Sending e-mail ..."
  nodeName=`uname -n` # get the host name
  mailSubject="Up2date - $nodeName"
  # create the e-mail subject line
  echo "$up2dateOutput" | mail -s "$mailSubject" root
  # send the e-mail to root
fi
exit 0

So far, it appears to do exactly what I had hoped … an e-mail notice
when there are packages that can be updated on my servers!
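To get it running daily from cron, an entry along these lines does the trick (the path and time here are just my example … save the script wherever you like and make it executable):

# run the up2date check every morning at 6:00 as root (/etc/crontab format)
0 6 * * * root /usr/local/bin/up2date-check.sh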

AJAX and Smart Clients

I always think that it’s funny how people want to say that it’s
either/or. As someone who has been working with AJAX application
development for years, I’ll say that AJAX has its share of
issues. At the same time, I’ll say that I can’t understand why
more web sites are not using AJAX!

AJAX provides a very rich and usable interface for web-based
applications. As both Internet Explorer and Mozilla/Firefox
now support extremely rich DOM interfaces, along with lots
of standards, most web sites could be offering a lot more functionality
to their visitors. Another huge advantage of AJAX is its impact on
servers. A good developer can take a lot of load off their servers,
and improve response time, by pushing more computing down to the client
machine.

Smart Clients, however, are the future for many applications.
This is most evident with e-mail … probably the most ubiquitous
“Smart Client” application on the planet. We run our e-mail
application, and bring all of the mail down to a cache on our local
hard disk. We can read and write e-mail any time and any place
… connected or disconnected … and later synchronize to the servers.

In any case … it’s not about either/or … instead both of these
models are simply enhancements of what we are already using. Both
AJAX and Smart Clients are going to be with us for a while!

Mary Jo Foley reports on the AJAX vs. Smart Client debate.

Mary Jo Foley: Could AJAX Wash Away ‘Smart Clients?’

[Scobleizer: Microsoft Geek Blogger]

I was a Zombie for a short while!

This is a good article to read to understand what is out there on the
Internet right now … a whole lot of Zombies!  For those of you
that do not know what a zombie is, it is a computer that has been
compromised … hijacked … infected … taken over.  In most
cases, this has been done without the owner of the computer even
noticing.  In fact … the people who are responsible for creating
these zombies do not want the owners to know, and in most cases do not
want to harm the owner or their data!

Zombie’d computers are platforms for launching a wide range of attacks
on other computers on the Internet.  They simply have some little
software processes running in the background … most of the
time unnoticed.  This software is like a virus, except that it is not
meant to impact the machine it is running on; instead, it allows a
malicious user to launch various types of attacks on other computers or
web sites.

I have read numerous articles that talk about large numbers of zombies
being used to attack gambling sites on the Internet … to shut them
down and extort large sums of money from them.

What is interesting is my experience last week … I set up a new
computer and plugged it into my Internet connection.  It had been
booted for less than 30 minutes … and I was downloading the various
security patches and updates … when I noticed a lot of network
activity.  After checking my new system, I found three unknown
processes running … and also found that it had hundreds of
connections to other computers all over the planet!  In just a
short amount of time on the Internet, my brand new workstation had
become a zombie!  I thought about what to do, and ended up
reformatting the hard disk and starting over … since I had no idea
what else might have been compromised on the machine.
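If you want to do a similar quick sanity check on a Linux box, a couple of commands like these (a rough sketch, not a real security audit) will show what is running and what is talking to the outside world:

# list every running process so unfamiliar ones stand out
ps aux

# show established TCP connections and the programs that own them (run as root to see the -p info)
netstat -antp | grep ESTABLISHED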

Over a Million Zombie PCs [Slashdot:]

Uh oh … now I understand C# …

Tonight is a news-reading and e-mail-reading evening. I’m way
behind on my reading and responding. I’ve been way too busy with
a new job, and I’ve been on the road. At the beginning of this
week, however, I was in a programming class and I learned C#. I’m
now moving all of my development to this new cross-platform language.

All of what I learned this week was in Microsoft Visual Studio. I
cannot say enough about how impressed I am with the complete Microsoft
development environment. The creators of this development solution
ought to be proud of what they have created.

I am also downloading and installing all of the latest Mono tools to
begin the process of developing C# on Linux. I am looking
forward to tracking the progress of the Mono project, and all of the
various components. What I really like is that C# and the support
behind it appears to be a new language – and complete application
deployment platform – that will deliver where Java seemed to
stumble. C# is now being actively and completely supported on the
two biggest platforms on earth – Windows as the largest installed base
of machines, and Linux as the rapidly growing contender. No JVM
to download and install … no strange looking User Interface.

Anyhow … slightly off-topic … but I wanted to comment on
this. I have to admit that I see C# as a big deal in the next
decade!

C#, .NET, and Visual Studio … amazing.

I have to admit that I am once again amazed by the power of
Microsoft. I just completed my first Microsoft training course in
a long time … to learn the C# programming language. It was an
awesome experience.

I have a long background in developing software, starting with assembly
language, Fortran, Basic, C, and other languages. I never really
moved to Java, but knew that I wanted to learn a current object-oriented
language. Over the last several years I have learned both Perl
and PHP, and these are impressive Open Source languages. When I
saw that the Mono project was getting going last year, I immediately
realized that C# was the language to learn … from a C programmer’s
perspective.

After three days in class I now have a good understanding of C#, which
I plan to use for both Windows and Linux development. The Mono
project is the Open Source project to bring C# and .NET to Linux …
and obviously Microsoft has C#, .NET and their development environment
Visual Studio well established and moving forward. I will be
looking at Mono, but I realize that they have their work cut out for
them … Microsoft’s development environment is impressive.

I have developed in Visual Basic 6 on Windows for a long time, and I
found this to be a spectacular solution for developing Windows
applications. I was able to rapidly create a wide range of
applications over the years, complete with installers, with very little
effort. After all of this, I felt spoiled when I had to deal with
text-mode development in Perl and PHP. I was really waiting for
this C# training … knowing that it was going to leverage a lot of the
same technologies.

Some of the core areas of Microsoft’s solution that I was most impressed with:

  • Visual Studio .NET 2003 –
    this is a very impressive Integrated Development Environment (IDE)
    solution. They have done a good job allowing for a lot of
    customization of the development environment. Once I had my
    desktop arranged, it was easy to flip between the visual UI designer,
    and the various code modules. Help was always there, and the
    Intellisense code completion was great. I admit that I wish it
    would complete using the tab key instead of the Ctrl-Spacebar they
    require; even so, it is invaluable.
  • Database Connectivity and Development
    – it is beyond easy to develop complex applications that access a wide
    range of databases, and data sources. Within the Visual Studio
    IDE, most of the development can be done using wizards and simply
    dragging and dropping database tables from the Server Explorer.
    All of the code to integrate the data sources and databases into your
    application is just written for you. You can then use the
    DataSet wizard to create the DataSet. Again … all of the
    code is basically written for you … and you are left to focus on your
    core logic and functionality.
  • XML Manipulation – so
    far, I haven’t found anything that I can’t do with XML. In almost
    no time this morning, I was able to program an HTTP request to grab an
    RSS XML file from one of my blogs. I was then able to transform
    this XML file into a DataSet with one or two lines of code. From
    an XML file, to a set of database tables ready to be read.
  • SOAP Client – ok … now
    this was just too easy. I simply located the URI for a web
    service that I was interested in. I actually searched the
    Microsoft UDDI directory through the integrated browser. I found
    a stock quote web service, and clicked the link to add it to the
    project. The next thing that I know, I simply have a new service
    with a couple of new methods that I can call. I then link the
    results of the SOAP request to a DataGrid … and can view the results.
  • SOAP Web Service – now this was just too easy. I simply
    went through a Wizard to create the base service class, and then added
    a series of methods that immediately become web-services methods.
    As I added each new method to the class, the build process seemed to
    simply re-deploy, and everything worked.
  • ASP.NET – now this was the final aspect of the course that I
    received today. I am absolutely blown away at how simple it is
    to create complex sets of interactive pages.

Now … I am completely open to Mono and very interested in its
success … however I now have a model product that they are going to
have to beat. Visual Studio and Microsoft are at the forefront
of development with their offering. I’ll post about my
further experiences … and my experiences with Mono!

Slashdot slashdotted by eTech

When I was reading my aggregator the last day of eTech, I found these
posts in my page of new articles.  I started to wonder “How the
heck is my aggregator going crazy?  What is going on here?  I’m not
doing this!” … and then I realized what was up.  At eTech, all of the
attendees were on the wireless network behind a NAT.  To Slashdot,
it must have looked like a lot of requests for their RSS feed from the
same address.  Slashdot thought this was all traffic coming from a
single user … and so they pitched back error messages instead of the feed.

It’s funny to see yet another way in which technology confuses
technology.  I’m not sure how this was solved … someone must
have contacted Slashdot to let them know.  Slashdot only
saw the one “identity” and assumed that it was a single user hammering
their servers.  Yet another case where some sort of solution could
be developed to encode identity into the RSS request.
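One simple form that could take (purely my own sketch of the idea … not something Slashdot or any aggregator actually does, and the URL and identifier below are placeholders) would be for each aggregator instance to send a distinguishing User-Agent with its feed request, so requests remain distinct even behind a shared NAT address:

# each aggregator instance identifies itself in the User-Agent header
curl -A "SomeAggregator/1.0 (instance-id: abc123)" http://example.com/slashdot.rss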

Funny …

GetRight … segmented downloading like BitTorrent

I have been using GetRight for a
long time. It is still, IMHO, one of the most amazing download
managers that has been written. It is to downloading what ICQ is
to IM … the ultimate download manager with options and features
beyond what the average person could ever use.


Tonight, I was using its “mirrors” capabilities, and realized that it provided a “torrent”-like capability long before BitTorrent
was around. GetRight allows me to click a link in my browser, and
select for GetRight to handle the downloading. As the download
starts, I can then go and visit other mirror sites for the same file,
and click those links also. GetRight will automatically notice
that it is already downloading that file, and start a new connection to
the new source server … and split the download into “segments”.
In this example I am downloading four segments of the Fedora Core 3 CD
#4. I simply went to four different mirror servers and clicked
the link to download the same file from each one. GetRight
handled everything else!
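You can actually fake a crude version of the same trick from the command line. Here is a sketch using curl byte ranges against two mirrors (the mirror names, file name, and split point are all just placeholders … real mirrors would of course need to serve identical copies of the file):

# pull the first half of the ISO from one mirror and the rest from another, in parallel
curl -r 0-349999999 -o part1 http://mirror1.example.com/FC3-i386-disc4.iso &
curl -r 350000000- -o part2 http://mirror2.example.com/FC3-i386-disc4.iso &
wait

# stitch the two segments back together into the complete file
cat part1 part2 > FC3-i386-disc4.iso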

It is intelligent software like this that probably contributed to
ideas like BitTorrent. In this case, I am able to leverage the
various mirrors that exist to increase my download bandwidth …
without requiring things to be in a BitTorrent format. It’s funny
that I have been doing this for quite a while, but failed to think
about the similarity to BitTorrent.

Radio Love/Hate Relationship

Yes … once again … my Radio Love/Hate Relationship.  There are
so many things that I love about Radio, however I am here at the
O’Reilly Emerging Technology Conference in San Diego, and I have not
been able to blog all week.  Well … I can actually blog, but I
can’t upstream.

I’m not sure why, but the Radio FTP is not working, and fails with a
stupid, non-descriptive error telling me that some sub-table doesn’t
exist.  By now, I know that means that I can’t get to the FTP
server.  There are just too many little problems like this that
really bug me.

What is Funny/Sad is that I have had many people here at the conference
tell me about Radio … and their frustrations.  I have heard
person after person talking about how they left the application due to
the same sorts of issues.  It’s sad to see the momentum that this
product had fall by the wayside in disrepair.  It still has more
advanced features than anything else that I can find … but I am
slowly beginning my search for something else.

I hope Userland can do something soon … I feel that time is running out for Radio.