Why using Identity is losing Identity

There are many things that computerized digital identity management
systems are going to be able to do … and many things they will
not. When I read this article about Amazon, I again thought about those who
believe that “I can control my identity” … I am not a strong believer
in this. Per my Second Axiom, I believe that identity is given
to us by the communities that we belong to. In addition, per this
article, I believe that the more we use our identity, the more we contribute to losing control of it!

What this article refers to is how Amazon is collecting information not only about its customers, but also about the friends and family
of its customers. It appears that if I am an Amazon customer
and choose to have Amazon send a gift to someone, Amazon will begin to aggregate
information about them as well! This also means that if my friends
or family choose to give me a gift via Amazon … people who I have
shared my identity information with … Amazon begins to
aggregate my identity! What is interesting is the depth of information
that Amazon is able to gather about me, or about the friends and family of
its customers. It’s almost a form of “consensual phishing”!
Amazon simply asks its customers, “Please provide us with a lot of
details about your friends and family!” … and we go ahead and enter
addresses, birthday information, etc. We sell out the identity of our own friends and family!

While I was at Novell working on digitalMe, I used to give a
variety of presentations where I would talk about the grocery store
cards that are given away to customers. I would ask the audience
how many people use these cards, and then follow up with a series of
questions:

  • What phase of the moon do you buy the most groceries? The grocery store knows.
  • What foods do you buy the most during a full moon? They know that also.
  • What month do you use the most toilet paper? Yep … they know that.
  • What do you feed your family? Of course they know that.
  • When did you have your first child? They know when the first diapers and baby food were purchased.
  • How quickly are your children growing? Diapers come in easily tracked sizes.
  • What color wrappers are you most likely to purchase? Ever thought about that?
  • What shelf do you purchase the most from? Hmmm … think about that one!
  • What in-store advertising do you respond to? Did you even consciously notice it?

People get the idea very quickly … the amount of information being
harvested about you is huge. If we take this in the Amazon
direction, there are all of the same kinds of questions I could ask about
our behavior … what you are likely to click on, what kinds of referrals convince you to purchase, etc.

The more that any of us interacts with the world around us, the more we
leave behind a trail of identity information that not only identifies our
behaviors, but also begins the process of spreading our identity over a
larger landscape … more and more places where we have little control
over it.

Amazon Knows Who You Are.
Many companies have systems for tracking customer habits, but Amazon
has collected info longer and used it more proactively. It now has
technology that tracks data on those you buy gifts for, and it reserves
the right to sell it all. [Wired News]

I was a Zombie for a short while!

This is a good article to read to understand what is out there on the
Internet right now … a whole lot of zombies! For those of you
who do not know what a zombie is, it is a computer that has been
compromised … hijacked … infected … taken over. In most
cases, this has been done without the owner of the computer even
noticing. In fact … the people responsible for creating
these zombies do not want the owners to know, and in most cases do not
want to harm the owners or their data!

Zombie’d computers are platforms for launching a wide range of attacks
on other computers on the Internet. They simply have some small
software processes running in the background … most of the
time unnoticed. This software is like a virus, but rather than
damaging the machine it runs on, it allows a malicious user to
launch various types of attacks on other computers or web
sites.

I have read numerous articles about large numbers of zombies
being used to attack gambling sites on the Internet … to shut them
down and extort large sums of money from them.

What is interesting is my experience last week … I set up a new
computer and plugged it into my Internet connection. It had been
booted for less than 30 minutes … and I was downloading the various
security patches and updates … when I noticed a lot of network
activity. After checking my new system, I found three unknown
processes running … and also found that it had hundreds of
connections to other computers all over the planet! In just a
short amount of time on the Internet, my brand-new workstation had
become a zombie! I thought about what to do, and ended up
reformatting the hard disk and starting over … I had no idea what else
might have been compromised on the machine.
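
For what it’s worth, a quick way to spot this kind of thing on a Linux box is to look at the process list and at the open network connections (a sketch, assuming the standard `ps` and `netstat` tools; Windows has a similar `netstat -an`):

```shell
# List all running processes; look for names you don't recognize.
ps aux

# Count established network connections ... a fresh workstation
# should have only a handful, not hundreds.
netstat -an 2>/dev/null | grep ESTABLISHED | wc -l
```

Hundreds of established connections on a machine that is only downloading patches is a pretty strong hint that something else is using your network card.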

Over a Million Zombie PCs [Slashdot]

Uh oh … now I understand C# …

Tonight is a news-reading and e-mail-reading evening. I’m way
behind on my reading and responding. I’ve been way too busy with
a new job, and I’ve been on the road. At the beginning of this
week, however, I was in a programming class and I learned C#. I’m
now moving all of my development to this new cross-platform language.

All of what I learned this week was in Microsoft Visual Studio. I
cannot say enough about how impressed I am with the complete Microsoft
development environment. The creators of this development solution
ought to be proud of what they have created.

I am also downloading and installing all of the latest Mono tools to
begin the process of developing C# on Linux. I am looking
forward to tracking the progress of the Mono project, and all of its
various components. What I really like is that C# and the support
behind it appear to be a new language – and a complete application
deployment platform – that will deliver where Java seemed to
stumble. C# is now being actively and completely supported on the
two biggest platforms on earth – Windows, with the largest installed base
of machines, and Linux, the rapidly growing contender. No JVM
to download and install … no strange-looking user interface.

Anyhow … slightly off-topic … but I wanted to comment on
this. I have to admit that I see C# as a big deal in the next
decade!

C#, .NET, and Visual Studio … amazing.

I have to admit that I am once again amazed by the power of
Microsoft. I just completed my first Microsoft training course in
a long time … to learn the C# programming language. It was an
awesome experience.

I have a long background in developing software, starting with assembly
language, Fortran, Basic, C, and other languages. I never really
moved to Java, but knew that I wanted to learn a current object-oriented
language. Over the last several years I have learned both Perl
and PHP, and these are impressive Open Source languages. When I
saw that the Mono project was getting going last year, I immediately
realized that C# was the language to learn … from a C programmer’s
perspective.

After three days in class I now have a good understanding of C#, which
I plan to use for both Windows and Linux development. The Mono
project is the Open Source project to bring C# and .NET to Linux …
and obviously Microsoft has C#, .NET and their development environment
Visual Studio well established and moving forward. I will be
looking at Mono, but I realize that they have their work cut out for
them … Microsoft’s development environment is impressive.

I have developed in Visual Basic 6 on Windows for a long time, and I
found it to be a spectacular solution for developing Windows
applications. I was able to rapidly create a wide range of
applications over the years, complete with installers, with very little
effort. After all of this, I was spoiled, and dealing with
text-mode development in Perl and PHP felt like a step backwards. I was really waiting for
this C# training … knowing that it was going to leverage a lot of the
same technologies.

Some of the core areas of Microsoft’s solution that I was most impressed with:

  • Visual Studio .NET 2003 –
    this is a very impressive Integrated Development Environment
    (IDE). They have done a good job of allowing a lot of
    customization of the development environment. Once I had my
    desktop arranged, it was easy to flip between the visual UI designer
    and the various code modules. Help was always there, and the
    IntelliSense code completion was great. I admit that I wish it
    would complete using the Tab key instead of the Ctrl-Spacebar they
    require; even so, it is invaluable.
  • Database Connectivity and Development
    – it is beyond easy to develop complex applications that access a wide
    range of databases and data sources. Within the Visual Studio
    IDE, most of the development can be done using wizards and by simply
    dragging and dropping database tables from the Server Explorer.
    All of the code to integrate the data sources and databases into your
    application is written for you. You can then use the DataSet wizard
    to create the DataSet. Again … all of the code is basically written
    for you … and you are left to focus on your core logic and
    functionality.
  • XML Manipulation – so
    far, I haven’t found anything that I can’t do with XML. In almost
    no time this morning, I was able to program an HTTP request to grab an
    RSS XML file from one of my blogs. I was then able to transform
    this XML file into a DataSet with one or two lines of code. From
    an XML file to a set of database tables, ready to be read.
  • SOAP Client – ok … now
    this was just too easy. I simply located the URI for a web
    service that I was interested in. I actually searched the
    Microsoft UDDI directory through the integrated browser. I found
    a stock quote web service, and clicked the link to add it to the
    project. The next thing I knew, I had a new service
    with a couple of new methods that I could call. I then linked the
    results of the SOAP request to a DataGrid … and could view the results.
  • SOAP Web Service – now this was just too easy. I simply
    went through a wizard to create the base service class, and then added
    a series of methods that immediately became web-service methods.
    As I added each new method to the class, the build process simply
    re-deployed the service, and everything worked.
  • ASP.NET – now this was the final aspect of the course
    today. I am absolutely blown away at how simple it is
    to create complex sets of interactive pages.
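
As a concrete sketch of that RSS-to-DataSet step (my own minimal version, assuming .NET 1.x and System.Data; the feed URL is a placeholder, not one from the course):

```csharp
using System;
using System.Data;

class RssToDataSet
{
    static void Main()
    {
        // DataSet.ReadXml infers a relational schema (channel, item, ...)
        // directly from the structure of the XML it reads.
        DataSet ds = new DataSet();
        ds.ReadXml("http://example.com/blog/rss.xml");

        // Each inferred table can now be used like any database table.
        foreach (DataTable table in ds.Tables)
            Console.WriteLine(table.TableName);
    }
}
```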

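The web-service side from the bullets above can be sketched as a minimal ASMX service (the class and method names here are my own illustration, not from the course; it assumes System.Web.Services in an ASP.NET project):

```csharp
using System.Web.Services;

// Dropped into an ASP.NET project as e.g. Quote.asmx, each [WebMethod]
// on this class becomes a callable SOAP operation on the next build.
public class QuoteService : WebService
{
    [WebMethod]
    public string GetQuote(string symbol)
    {
        // A real service would look the symbol up; this is a stub.
        return "No quote available for " + symbol;
    }
}
```
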
Now … I am completely open to Mono and very interested in its
success … however I now have a model product that they are going to
have to beat. Visual Studio and Microsoft are at the forefront
of development with their offering. I’ll post about my
further experiences … and my experiences with Mono!

Novell’s Open Source Technology Center

This morning at Brainshare the first keynote was the Governor of Utah,
Jon Huntsman Jr., and he discussed a lot of information about
Utah. I started to wonder why he was talking at Brainshare to a
bunch of geeks; however, he finally touched on the core aspect of his
talk … the newly announced Open Source Technology Center.

I really like the idea, and I look forward to finding out more about
what they are going to offer, and at what cost. I have been
working with folks at the Miller Business Innovation Center
in the Salt Lake Valley, and I have learned a lot about how they
structure their deals and offerings. I am currently working on
some start-up ideas that might benefit from such a location.

The funny part is the “other side” of this announcement: this
is what Novell is going to do with all of the “empty buildings” on the
Provo campus. With all of the layoffs, and the shift of power to Cambridge, the Provo campus has started to become a ghost town.

Slashdot slashdotted by eTech

When I was reading my aggregator on the last day of eTech, I found these
posts in my page of new articles. I started to wonder, “How the
heck is my aggregator going crazy? What is going on here? I’m not doing this!”
… and then I realized what was up. At eTech, all of the
attendees were on the wireless network behind a NAT. To Slashdot,
it must have looked like a lot of requests for their RSS feed from the
same address. Slashdot thought this was all traffic coming from a
single user … and so it sent back error messages instead of the feed.

It’s funny to see yet another way in which technology confuses
technology. I’m not sure how this was solved … someone must
have contacted Slashdot to let them know. Slashdot only
saw the one “identity” and assumed that it was a single user hammering
their servers. Yet another case where some sort of solution could
be developed to encode identity into the RSS request.
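
One minimal sketch of what encoding identity into the RSS request could look like … an aggregator sending a distinguishing User-Agent so the server can tell apart clients sharing one NAT’d address (this uses .NET’s HttpWebRequest; the URL and agent string are purely illustrative):

```csharp
using System;
using System.Net;

class IdentifiedFetch
{
    static void Main()
    {
        HttpWebRequest req =
            (HttpWebRequest)WebRequest.Create("http://example.com/index.rss");

        // The User-Agent distinguishes this aggregator (and this user)
        // from the other clients behind the same NAT'd IP address.
        req.UserAgent = "MyAggregator/1.0 (user: some-unique-id)";

        using (WebResponse resp = req.GetResponse())
        {
            Console.WriteLine("Fetched " + resp.ContentLength + " bytes");
        }
    }
}
```

A server doing per-client throttling could then key on the agent string rather than the IP address alone.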

Funny …

Fedora Core 1 upgrades and Sendmail

I have slowly been upgrading all of my old RedHat boxes to Fedora Core
1. I know that even this is old; however, this is a tested
configuration for what we wanted to do on our wireless network
infrastructure, and there are some known problems with moving to the
v2.6.x Linux kernel. I don’t want to deal with those yet.

I have now done three upgrades using the Anaconda installer that comes
with Fedora Core, and I have to say that I am impressed. It just
works. Except for Sendmail. In each install that I have
done, Sendmail just stops working and begins to emit useless errors
into the log … or at least they are useless to me. On this
latest upgrade, I have spent hours debugging the installation
over the last two or three weeks.

Today I found a simple solution for debugging these
issues. I’m not sure why I didn’t think of this before. I
simply used “rpm” to erase/uninstall Sendmail … and then used
“up2date” to install it again. Jackpot! Sendmail is
now working on this newly upgraded server. I’m not going to
forget this “solution.”
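
For the record, the two commands were essentially these (a sketch … the package name is per Fedora Core 1, and up2date needs root and network access):

```shell
# Remove the broken sendmail package left over from the upgrade ...
rpm -e sendmail

# ... then pull a fresh copy (and fresh config) from the update channel.
up2date sendmail
```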

Wow … it’s almost like rebooting Windows!