About Scott C. Lemon

I'm a techno-futurist, interested in all aspects of humanity, sociology, community, identity, and technology. As we all approach the Singularity, I'm just having fun affecting the outcomes of the future!

I was a Zombie for a short while!

This is a good article to read to understand what is out there on the
Internet right now … a whole lot of zombies!  For those of you
who do not know what a zombie is, it is a computer that has been
compromised … hijacked … infected … taken over.  In most
cases, this has been done without the owner of the computer even
noticing.  In fact … the people responsible for creating
these zombies do not want the owners to know, and in most cases do not
want to harm the owner or their data!

Zombie’d computers are platforms for launching a wide range of attacks
on other computers on the Internet.  They simply have some small
software processes running in the background … most of the
time unnoticed.  This software is like a virus, but its purpose is not
to impact the machine it is running on; instead, it allows a malicious
user to launch various types of attacks on other computers or web
sites.

I have read numerous articles that talk about large numbers of zombies
being used to attack gambling sites on the Internet … to shut them
down and extort large sums of money from them.

What is interesting is my experience last week … I set up a new
computer and plugged it into my Internet connection.  It had been
booted for less than 30 minutes … and I was downloading the various
security patches and updates … when I noticed a lot of network
activity.  After checking my new system, I found three unknown
processes running … and also found that it had hundreds of
connections to other computers all over the planet!  In just a
short amount of time on the Internet, my brand new workstation had
become a zombie!  I thought about what to do, and ended up
reformatting the hard disk and starting over … I had no idea what else
might have been compromised on the machine.

Over a Million Zombie PCs [Slashdot]

Uh oh … now I understand C# …

Tonight is a news-reading and e-mail-reading evening. I’m way
behind on my reading and responding. I’ve been way too busy with
a new job, and I’ve been on the road. At the beginning of this
week, however, I was in a programming class and I learned C#. I’m
now moving all of my development to this new cross-platform language.

All of what I learned this week was in Microsoft Visual Studio. I
cannot say enough about how impressed I am with the complete Microsoft
development environment. The creators of this development solution
ought to be proud of what they have created.

I am also downloading and installing all of the latest Mono tools to
begin the process of developing C# on Linux. I am looking
forward to tracking the progress of the Mono project, and all of its
various components. What I really like is that C#, and the support
behind it, appears to be a new language – and a complete application
deployment platform – that will deliver where Java seemed to
stumble. C# is now being actively and completely supported on the
two biggest platforms on earth – Windows as the largest installed base
of machines, and Linux as the rapidly growing contender. No JVM
to download and install … no strange-looking user interface.
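
To make the cross-platform point concrete, here is a minimal sketch of
what I mean. The file name and setup are just assumptions for
illustration, but the same trivial C# source should build with the
Visual Studio compiler on Windows and with Mono's mcs compiler on Linux.

    using System;

    // hello.cs -- the same source compiles under Visual Studio / csc on
    // Windows or under Mono's mcs on Linux, and runs on either runtime.
    class Hello
    {
        static void Main()
        {
            Console.WriteLine("Hello from C# on {0}", Environment.OSVersion);
        }
    }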

Anyhow … slightly off-topic … but I wanted to comment on
this. I have to admit that I see C# as a big deal in the next
decade!

C#, .NET, and Visual Studio … amazing.

I have to admit that I am once again amazed by the power of
Microsoft. I just completed my first Microsoft training course in
a long time … to learn the C# programming language. It was an
awesome experience.

I have a long background in developing software, starting with assembly
language, Fortran, Basic, C, and other languages. I never really
moved to Java, but knew that I wanted to learn a current object-oriented
language. Over the last several years I have learned both Perl
and PHP, and these are impressive Open Source languages. When I
saw that the Mono project was getting going last year, I immediately
realized that C# was the language to learn … from a C programmer's
perspective.

After three days in class I now have a good understanding of C#, which
I plan to use for both Windows and Linux development. The Mono
project is the Open Source project to bring C# and .NET to Linux …
and obviously Microsoft has C#, .NET and their development environment
Visual Studio well established and moving forward. I will be
looking at Mono, but I realize that they have their work cut out for
them … Microsoft’s development environment is impressive.

I have developed in Visual Basic 6 on Windows for a long time, and I
found it to be a spectacular solution for developing Windows
applications. I was able to rapidly create a wide range of
applications over the years, complete with installers, with very little
effort. After all of this, I felt spoiled when I had to deal with
text-mode development in Perl and PHP. I was really waiting for
this C# training … knowing that it was going to leverage a lot of the
same technologies.

Some of the core areas of Microsoft’s solution that I was most impressed with:

  • Visual Studio .NET 2003 –
    this is a very impressive Integrated Development Environment (IDE)
    solution. They have done a good job allowing for a lot of
    customization of the development environment. Once I had my
    desktop arranged, it was easy to flip between the visual UI designer
    and the various code modules. Help was always there, and the
    IntelliSense code completion was great. I admit that I wish it
    would complete using the Tab key instead of the Ctrl-Spacebar they
    require; however, it is invaluable.
  • Database Connectivity and Development
    – it is beyond easy to develop complex applications that access a wide
    range of databases and data sources. Within the Visual Studio
    IDE, most of the development can be done using wizards and by simply
    dragging and dropping database tables from the Server Explorer.
    All of the code to integrate the data sources and databases into your
    application is written for you. You can then use the
    DataSet wizard to create the DataSet. Again … all of the
    code is basically written for you … and you are left to focus on your
    core logic and functionality.
  • XML Manipulation – so
    far, I haven’t found anything that I can’t do with XML. In almost
    no time this morning, I was able to program an HTTP request to grab an
    RSS XML file from one of my blogs. I was then able to transform
    this XML file into a DataSet with one or two lines of code (there is a
    small sketch of this just after the list). From an XML file to a set
    of in-memory database tables, ready to be read.
  • SOAP Client – ok … now
    this was just too easy. I simply located the URI for a web
    service that I was interested in. I actually searched the
    Microsoft UDDI directory through the integrated browser. I found
    a stock quote web service, and clicked the link to add it to the
    project. The next thing I knew, I simply had a new service
    with a couple of new methods that I could call. I then linked the
    results of the SOAP request to a DataGrid … and could view the results.
  • SOAP Web Service – now this was just too easy. I simply
    went through a wizard to create the base service class, and then added
    a series of methods that immediately became web-service methods
    (see the second sketch after this list).
    As I added each new method to the class, the build process seemed to
    simply re-deploy, and everything worked.
  • ASP.NET – now this was the final topic of the course that I
    covered today. I am absolutely blown away at how simple it is
    to create complex sets of interactive pages.
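
Here is a rough sketch of the XML-to-DataSet trick mentioned above.
This is my reconstruction of the idea rather than the exact code from
class, and the feed URL is just a placeholder: DataSet.ReadXml() pulls
the RSS apart into tables on its own.

    using System;
    using System.Data;
    using System.Xml;

    class RssToDataSet
    {
        static void Main()
        {
            // Placeholder feed URL -- XmlTextReader will fetch it over HTTP.
            XmlTextReader reader = new XmlTextReader("http://example.com/blog/rss.xml");

            // ReadXml infers a table for each repeating element (channel, item, ...).
            DataSet ds = new DataSet();
            ds.ReadXml(reader);

            foreach (DataTable table in ds.Tables)
                Console.WriteLine("{0}: {1} rows", table.TableName, table.Rows.Count);
        }
    }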
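
And here is a minimal sketch of what those web-service methods look
like. The class and method names are made up for illustration, but the
pattern is simply a class with [WebMethod] attributes … each attributed
method becomes callable over SOAP once the service is re-deployed.

    using System.Web.Services;

    // A minimal .asmx-style web service: each [WebMethod] becomes a
    // SOAP-callable operation.
    public class QuoteService : WebService
    {
        [WebMethod]
        public string Hello(string name)
        {
            return "Hello, " + name;
        }
    }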

Now … I am completely open to Mono and very interested in its
success … however I now have a model product that they are going to
have to beat. Visual Studio and Microsoft are at the forefront
of development with their offering. I’ll post about my
further experiences … and my experiences with Mono!

Novell’s Open Source Technology Center

This morning at Brainshare the first keynote was the Governor of Utah,
Jon Huntsman Jr., and he shared a lot of information about
Utah. I started to wonder why he was talking at Brainshare to a
bunch of geeks; however, he finally touched on the core aspect of his
talk … the newly announced Open Source Technology Center.

I really like the idea, and I look forward to finding out more about
what they are going to offer, and at what cost. I have been
working with folks at the Miller Business Innovation Center
in the Salt Lake Valley, and I have learned a lot about how they
structure their deals and offerings. I am currently working on
some start-up ideas that might benefit from such a location.

The funny part is the “other side” of this new announcement. This
is what Novell is going to do with all of the “empty buildings” on the
Provo campus. With all of the layoffs, and the shift of power to Cambridge, the Provo campus has started to become a ghost town.

Slashdot slashdotted by eTech

When I was reading my aggregator on the last day of eTech, I found a
batch of Slashdot error messages in my page of new articles.  I
started to wonder “Why the heck is my aggregator
going crazy?  What is going on here?  I’m not doing this!”
… and then I realized what was up.  At eTech, all of the
attendees were on the wireless network behind a NAT.  To Slashdot,
it must have looked like a lot of requests for their RSS feed from the
same address.  Slashdot thought this was all traffic coming from a
single user … and so they pitched error messages back out instead of
the feed.

It’s funny to see yet another way in which technology confuses
technology.  I’m not sure how this was solved … someone must
have contacted Slashdot to let them know.  Slashdot only
saw the one “identity” and assumed that it was a single user hammering
their servers.  Yet another case where some sort of solution could
be developed to encode identity into the RSS request.
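
Just to illustrate one naive way an aggregator could “encode identity”
into the request: this is purely my own sketch (the feed URL and agent
string are made up), using a descriptive User-Agent header so that many
readers behind one NAT address do not all look like the same client.

    using System;
    using System.IO;
    using System.Net;

    class FeedFetcher
    {
        static void Main()
        {
            // Hypothetical feed URL and identity string.
            HttpWebRequest request =
                (HttpWebRequest)WebRequest.Create("http://example.com/index.rss");
            request.UserAgent = "MyAggregator/1.0 (reader: scott@example.com)";

            using (WebResponse response = request.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                string feed = reader.ReadToEnd();
                Console.WriteLine("Fetched {0} characters of RSS", feed.Length);
            }
        }
    }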

Funny …

Fedora Core 1 upgrades and Sendmail

I have slowly been upgrading all of my old RedHat boxes to Fedora Core
1.  I know that even this is old, however it is a tested
configuration for what we wanted to do on our wireless network
infrastructure, and there are some known problems with moving to the
v2.6.x Linux kernel.  I don’t want to deal with those yet.

I have now done three upgrades, using the anaconda installer that comes
with Fedora Core, and I have to say that I am impressed.  It just
works.  Except for Sendmail.  In each install that I have
done, sendmail just stops working, and begins to emit useless errors
into the log … or at least they are useless to me.  On this
latest upgrade, I have spent hours of time debugging the installation
over the last two or three weeks.

Today I found a simple alternative to debugging these
issues.  I’m not sure why I didn’t think of this before.  I
simply used “rpm” to erase/uninstall sendmail … and then used
“up2date” to install it again.  Jackpot!  Sendmail is
now working on this newly upgraded server.  I’m not going to
forget this “solution.”

Wow … it’s almost like rebooting Windows!

GetRight … segmented downloading like BitTorrent

I have been using GetRight for a
long time. It is still, IMHO, one of the most amazing download
managers ever written. It is to downloading what ICQ is
to IM … the ultimate download manager, with options and features
beyond what the average person could ever use.


Tonight, I was using its “mirrors” capabilities, and realized that it provided a “torrent”-like capability long before BitTorrent
was around. GetRight allows me to click a link in my browser, and
select for GetRight to handle the downloading. As the download
starts, I can then go and visit other mirror sites for the same file,
and click those links as well. GetRight will automatically notice
that it is already downloading that file, and start a new connection to
the new source server … splitting the download into “segments”.
In this example I am downloading four segments of the Fedora Core 3 CD
#4. I simply went to four different mirror servers and clicked
the link to download the same file from each one. GetRight
handled everything else!
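
For the curious, here is a rough sketch of the idea behind segmented
downloading. This is not GetRight’s actual code, and the mirror URLs,
file name, and segment sizes are all made up; the point is simply that
HTTP range requests let you pull different byte ranges of the same file
from different mirrors and stitch them back together in order.

    using System;
    using System.IO;
    using System.Net;

    class SegmentedDownload
    {
        // Fetch one byte range of a file from a given mirror.
        static byte[] FetchRange(string url, int from, int to)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            request.AddRange(from, to);   // ask for just this segment
            using (WebResponse response = request.GetResponse())
            using (Stream stream = response.GetResponseStream())
            using (MemoryStream buffer = new MemoryStream())
            {
                byte[] chunk = new byte[8192];
                int read;
                while ((read = stream.Read(chunk, 0, chunk.Length)) > 0)
                    buffer.Write(chunk, 0, read);
                return buffer.ToArray();
            }
        }

        static void Main()
        {
            // Hypothetical mirrors hosting the same ISO image.
            string mirrorA = "http://mirror-a.example.com/FC3-i386-disc4.iso";
            string mirrorB = "http://mirror-b.example.com/FC3-i386-disc4.iso";

            // Pull the first segment from one mirror and the second from
            // another, then write the segments back out in order.
            byte[] first  = FetchRange(mirrorA, 0, 499999);
            byte[] second = FetchRange(mirrorB, 500000, 999999);

            using (FileStream output = new FileStream("disc4.iso", FileMode.Create))
            {
                output.Write(first, 0, first.Length);
                output.Write(second, 0, second.Length);
            }
            Console.WriteLine("Wrote {0} bytes", first.Length + second.Length);
        }
    }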

It is intelligent software like this that probably contributed to
ideas like BitTorrent. In this case, I am able to leverage the
various mirrors that exist to increase my download bandwidth …
without requiring things to be in a BitTorrent format. It’s funny
that I have been doing this for quite a while, but failed to think
about the similarity to BitTorrent.