About Scott C. Lemon

I'm a techno futurist, interested in all aspects of humanity, sociology, community, identity, and technology. While we are all approaching the Singularity, I'm just having fun effecting the outcomes of the future!

WISP.Org … off to a slow start!

Ok … so the holidays got the best of me.  Followed by a lot of
work to prepare for deploying wireless at the Sundance Film Festival in
Park City, Utah.  And also maintaining the 80211.net
wireless infrastructure.  Working in wireless can keep you
busy.  However, it’s always fun to learn something new and to
experiment with new equipment.

That is the kind of information that I had hoped to provide on this
site … experiences and information about deploying and operating
wireless internet services.  Over the next couple of weeks, I’ll
be introducing some of the people who will be contributing to our
forums.  These are all people from the industry, either WISPs or
manufacturers.  All looking to contribute and educate.

I’ll be doing my best to keep up with them, and to document some of the work that I am doing!

I’ll miss the dorkbot meeting tonight!

I’m still out in California at the VSLive event … it’s been
great. My friend Todd called to see if I was going to be around
tonight to go to a dorkbot meeting. What? Something new to see …

I visited the web site for the San Francisco chapter of dorkbot, and I have to admit that I am bummed that I am going to miss this. The tag line is “People doing strange things with electricity” … and the list of projects sure looks cool.

I’m going to have to come back. Or even better … I’ll have to start a chapter in Salt Lake City!

Goin’ to California …

I’m on my way to San Francisco to attend a conference as part of my new
job. I recently chose to join an awesome team of people at a
company called Agilix.
Agilix is revolutionizing the mobile and Tablet PC market with a
layered set of products. These products start with some of the
best tools to develop “Ink-enabled” applications for the Tablet
PC. You can go visit the Tablet PC developer site at Microsoft and download a copy. If you want to see the full breadth of what is possible, then come to the Agilix site and purchase the SDK!

The next layer up will be the GoBinder SDK. This is a platform for
developing mobile applications and it provides a rich set of features
that ease development, and take care of many of the core tasks when
considering how to “mobilize” your application and data.

The top layer is the GoBinder product itself … a full blown
“organizer” that is currently being sold to university and college
students. This is a product that replaces the physical binder of
the old world with the same functionality on your
laptop or Tablet PC. For people who are familiar with the
Franklin-Covey set of software products, Agilix is the company that has
been writing these products for years.

Anyhow … I’ll blog more about the conference … this is going to be
fun. I look forward to exploring the Tablet PC and learning more
about how far they have come. It’s not a far leap from the
wearable computers that I have been experimenting with over the last
decade!

Flying again …

I haven’t been doing much commercial flying lately.  In fact, I’m
sitting here at the Salt Lake City Airport thinking that this is the
first time that I have flown in over six months!  When I was
leaving the house I had to do the standard “security review” to think
about all of the things that I might have thrown into my computer bag,
or my suitcase, while I haven’t been flying … all of the things like
a small screwdriver, my fingernail clippers, and a pair of tweezers.

Getting to the airport it wasn’t too bad … a good time of night to
fly out.  Of course I was bummed to see that only airlines like
JetBlue understand the value of free wireless.  I got a signal
from the Sprint wireless network, in partnership with the Salt Lake
City Airport, and they only wanted $9.95 for 24 hours!  Yeah …
right … can I PLEASE pay that much for Internet Access?  What
century are these guys living in?  It’s a form of extortion in my
opinion.  If you are going to charge, then make it a reasonable
price!  I would pay $10 per month … that’s what we charge on our wireless network at 80211.net.

Oh well … I’ll have bandwidth when I arrive …

A Fourth Axiom of Identity

I can completely understand the natural human tendency to want to claim
ownership of our identity.  I constantly see the statements “It’s mine!  I want to control it!  I want to determine who can see it!”

After years of looking at this space, however, I have become convinced
that our identity, as we know it, is already spread across the
communities that gave us that identity.  That is the basis of the First and Second Axioms that I posited.  I see identity as an accumulated thing … and it only exists with language.  It is language that allows us to distinguish identity.

It is this train of thought that takes me to the Fourth Axiom of Identity:

I posit that for an
effective community to exist there must be verified agreement, which
requires a minimum of three community members.

As I stated above, I believe it is language that allows us to
distinguish identity.  Language requires agreement on the meaning
of words.  The only way to have verified agreement is to have a
third party as a “tiebreaker.”  This does not prevent
disagreement, however it allows for the verification of a previous
agreement.

Where I see this relating to identity is in the area of verification, or authentication
of identity attributes.  Here, I mean authentication as in
“verifying the authenticity of.”  I can always choose to accept
the identity information that I receive from another entity, however
the value and accuracy of that identity information is always in
question until I am able to verify it.  Verifying identity
information will always involve some other member of a common
community that is able to provide the verification.

When applying for a financial loan, I can tell a bank about my employer
and salary information, however they are probably going to verify the
information with my employer.  All three of us exist within a
common community that enables this, and this community has defined the
language and protocols to enable this verification process to
occur.  There is the ability to generate a verified agreement.

Even in a conversation with my employer, we could potentially disagree
about my salary.  So what do we do?  We get lawyers involved
– the third party – to examine the contracts and employment agreement
to verify the information.  This can only occur within a community
that has the same language and contract law.  Verified agreement.

In the digital identity world, this quickly becomes an important aspect
of  any system that is going to replace our paper-based
systems.  Often, just as in paper-based systems, it is not enough
to communicate only the identity information.  It is often
critical to also communicate the sources of verification … some third
party that can provide verified agreement.  This, in most cases,
is also the source of that identity attribute.
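
To make this concrete, here is a minimal sketch of verified agreement in code.  It is written in Python with entirely hypothetical names and data … a toy model of the pattern, not any real protocol:

    # A toy model of "verified agreement": a claim about an identity
    # attribute is only accepted once a third community member - the
    # source of that attribute - confirms it.  All names hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Claim:
        subject: str      # who the claim is about
        attribute: str    # e.g. "salary"
        value: str        # the asserted value
        source: str       # the community member that can verify it

    class Verifier:
        """A community member (e.g. an employer) holding authoritative records."""
        def __init__(self, name, records):
            self.name = name
            self.records = records  # {(subject, attribute): value}

        def verify(self, claim):
            return self.records.get((claim.subject, claim.attribute)) == claim.value

    def accept(claim, verifiers):
        """A relying party (e.g. a bank) accepts a claim only after
        checking it with the third party named as its source."""
        source = verifiers.get(claim.source)
        return source is not None and source.verify(claim)

    # Three parties: the subject, the relying party, and the verifying source.
    employer = Verifier("employer", {("scott", "salary"): "50000"})
    claim = Claim("scott", "salary", "50000", source="employer")
    print(accept(claim, {"employer": employer}))  # True ... verified agreement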

Identity, Directories, and LID

I wish that I had more hours in the day.  I have been wanting to respond to an e-mail from Johannes Ernst (I swear I will!  I’m reading the LID docs again!) for weeks now … and I also wanted to reply to this post that he wrote the other day.

In his post, he responds to some of the comments that I made about
directories, and I wanted to clarify a couple of points.  He lists
three issues that I will address here:

  • LID is decentralized and does not depend on any
    directory (we’ll talk about some exciting consequences of that in a few
    weeks… stay tuned)

I am in full agreement, and my directory solution is also fully
decentralized.  Anyone who knew me at Novell during our years of
work on digitalMe knows that I was a maniac about a project out of our
labs in India called “Personal Directory.”  You can still go and download a copy
and check it out.  This is a full blown LDAP v3 directory service
that can run on your desktop.  In my perspective of how
directories can be integrated and used for identity, I do not believe
in “one big directory in the sky”, nor “a bunch of directories”, but
instead see these running everywhere.

As I started to read the LID documentation, I realized that I could
probably put an LDAP directory behind the LID protocols, and serve
information directly from the directory.  The benefit here is that
directories like this are already in use in thousands or millions of
businesses out there … so leveraging this existing base of identity
information just happens.
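
As a rough sketch of what I mean – assuming the Python ldap3 library, and with a hypothetical host, base DN, and schema – a LID endpoint could answer an attribute request straight out of the directory:

    # Sketch: answer a LID attribute request from an LDAP directory
    # instead of a static file.  Uses the ldap3 Python library; the
    # host, base DN, and attribute names are all hypothetical.
    from ldap3 import Server, Connection, ALL
    from ldap3.utils.conv import escape_filter_chars

    def lookup_identity(uid, attributes):
        server = Server("ldap://localhost:389", get_info=ALL)
        # Anonymous bind for public attributes; a real deployment could
        # bind as the requestor so the directory enforces visibility.
        with Connection(server, auto_bind=True) as conn:
            conn.search(
                search_base="dc=example,dc=com",
                search_filter="(uid=%s)" % escape_filter_chars(uid),
                attributes=attributes,
            )
            if not conn.entries:
                return None
            entry = conn.entries[0]
            return {attr: entry[attr].value for attr in attributes}

    print(lookup_identity("slemon", ["cn", "mail", "telephoneNumber"]))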

  • access control “down to the attribute level” is all fine, but
    unless the person owning the identity is in control, it won’t be used much
    (most directories I’ve seen are all-or-nothing things, and maintaining all
    of those rights centrally quickly becomes so expensive that few do it)

Yes!  This was one of the core benefits we were working on with
digitalMe … a way for users to manage their own identity, and also
the synchronization of their attributes – selectively – into other
personal and community directories.  The power that we were
exploiting was a standard feature of Novell’s directory implementations
… the ability to easily determine who could access/modify any object
down to the attribute level.  We then worked on automating the
process of a local agent keeping your identity information up to date
with the personal and community directories where you had defined a
relationship.
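
digitalMe is long gone, but the shape of the idea is easy to sketch.  This toy Python model – my own illustration, not how Novell’s directory actually implemented its rights – shows access resolved per attribute rather than all-or-nothing on the object:

    # Toy model of attribute-level access control: rights are granted
    # per (trustee, attribute) pair instead of all-or-nothing on the
    # object.  My own illustration, not Novell's implementation.

    class IdentityObject:
        def __init__(self, attributes):
            self.attributes = attributes   # {name: value}
            self.acl = {}                  # {(trustee, attribute): set of rights}

        def grant(self, trustee, attribute, *rights):
            self.acl.setdefault((trustee, attribute), set()).update(rights)

        def read(self, trustee, attribute):
            if "read" not in self.acl.get((trustee, attribute), set()):
                raise PermissionError("%s may not read %s" % (trustee, attribute))
            return self.attributes[attribute]

    me = IdentityObject({"mail": "scott@example.com", "salary": "50000"})
    me.grant("friends", "mail", "read")    # friends can see my e-mail address ...
    print(me.read("friends", "mail"))
    try:
        me.read("friends", "salary")       # ... but not my salary
    except PermissionError as err:
        print(err)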

  • he doesn’t talk about how this would work across the boundaries of a
    directory, or an organization.

Hopefully, some of my explanation above reveals some of what we were
exploring.  With digitalMe, I would have my ‘personal directory’
where I would have an object representing me
to keep my own personal identity information, along with objects
representing friends, family, and associates that I have relationships
with.  Corporations or other communities would then have their own
directories containing objects representing the identities of their
members and associates … one of those objects might represent me if I
have a relationship with that entity.

As part of our redundancy and fault tolerance plans, we had also looked
to the future where I might also replicate my directory to other
computers (my home computer?) or hosted directories (a bank?) so that
there is no single point of failure or loss.

One of the things that I really like about LID, when thinking about integration
with directories, is the layers of abstraction that can be
implemented.  I could easily modify the index.cgi (ok … if I had some spare time!)
so that it uses a directory to obtain the user attributes, instead of
the various vCard and FOAF XML files.  If the LID request also
passes through the credentials of the requestor, then the directory
would automatically return only the attributes visible to that
requestor.  If I still wanted the foaf.xml or vcard.xml files, I
could generate these dynamically on the fly – from the directory – as
an alternative.  In a business environment, there might already be
a directory that contains a great deal of information about me.
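
Here is roughly what I have in mind, again as a hedged Python sketch: the foaf.xml becomes a view generated on the fly from whatever attributes the directory lets this requestor see.  The lookup_identity() helper is the hypothetical directory query from the earlier sketch, stubbed out here so the example stands alone:

    # Sketch: generate foaf.xml on the fly from directory attributes.
    # lookup_identity() stands in for the LDAP query sketched earlier;
    # a real version would bind as `credentials` so the directory
    # itself trims the result to what that requestor may see.
    from xml.sax.saxutils import escape

    def lookup_identity(uid, attributes, credentials=None):
        fake_directory = {"slemon": {"cn": "Scott C. Lemon",
                                     "mail": "scott@example.com"}}
        entry = fake_directory[uid]
        return {a: entry[a] for a in attributes if a in entry}

    FOAF_TEMPLATE = """<rdf:RDF
        xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
        xmlns:foaf="http://xmlns.com/foaf/0.1/">
      <foaf:Person>
        <foaf:name>%(name)s</foaf:name>
        <foaf:mbox>mailto:%(mail)s</foaf:mbox>
      </foaf:Person>
    </rdf:RDF>"""

    def foaf_for(uid, credentials):
        attrs = lookup_identity(uid, ["cn", "mail"], credentials)
        return FOAF_TEMPLATE % {"name": escape(attrs["cn"]),
                                "mail": escape(attrs["mail"])}

    print(foaf_for("slemon", credentials=None))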

Overall, I really like what I see with LID … I’m going to continue
reading and maybe play with the scripts.  Maybe I’ll make the time
to do some modifications …  😉

Kim’s Fifth Law … common sense to many of us!

Kim Cameron posted his Fifth Law of Identity, and I was surprised that more people didn’t just jump in and agree. I was really surprised that Craig Burton didn’t jump for joy as the entire law parallels some of the work that Craig led at Novell years ago.

Kim’s new Law is as follows:

The Law of Pluralism:

A universal identity system MUST channel and enable the interworking of multiple identity technologies run by multiple identity providers.

This reminds me of the original work at Novell on Open Protocol Technology
– OPT – when we began to support multiple application protocols
for file system access.

As a brief history, NetWare was a “next generation” kernel and
operating system when it was introduced to the market. For a
transport protocol, it used a variation of the Xerox XNS protocols that Novell renamed as IPX, SPX, RIP, SAP,
and others. On top of this transport (the equivalent of TCP/IP in
the Internet) was the application protocol for making file system
requests – the NetWare Core Protocol
or NCP. To simplify this, NCP can be thought of as similar to NFS
… a file access protocol. So where UNIX systems would use NFS
on a transport of TCP/IP, NetWare servers would be accessed from DOS
workstations using NCP on a transport of IPX.

The first step towards Open Protocol Technology – or a form of Pluralism – was with Novell NetWare v2 (actually it was version 2.15 in 1988!) when Novell added support for the AppleTalk Protocol Suite,
allowing Apple Macintosh computers to see a NetWare server as though it
were an Apple server. This was done by adding support for the
Apple transport protocols, and also the file protocols. So now
DOS and Windows workstations could access files on the server using
NCP/IPX, and Macintosh computers accessed the same files … using
their native tongue, the AppleTalk Filing Protocol (AFP).

Soon after this, Novell added support for TCP/IP, NFS, and FTP with the
release of NetWare v3. Novell actually went even further when it
implemented the OSI protocol stack on NetWare. I still have a sealed box of NetWare FTAM, the product in which Novell implemented the FTAM file protocols on top of an OSI protocol stack!

In this example of “pluralism,” Novell was able to create a product that
supported file system access via numerous transport protocols, and
numerous file access protocols. We had demonstration networks
showing where machines running DOS or Windows, along with
Macintoshes, and UNIX machines, were all sharing files on the
NetWare server. This was in 1989 through 1991!

If we fast forward to now, this is a common feature of almost any
operating system! Even the Linux systems in use today have the
ability to mirror this type of functionality with multiple transport
protocol support, and projects like Samba, Netatalk, etc.
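
The design pattern underneath all of this is simple: one shared back end, many protocol front ends, each speaking the client’s native tongue.  Here is a toy Python sketch of that shape – entirely my own illustration, with nothing to do with NetWare’s real internals:

    # Toy sketch of protocol pluralism: one file store, multiple
    # protocol front ends.  Illustrative only.

    class FileStore:
        """The single back end that every protocol handler shares."""
        def __init__(self):
            self.files = {}

        def write(self, path, data):
            self.files[path] = data

        def read(self, path):
            return self.files[path]

    class NCPHandler:
        """Serves NetWare clients (NCP over IPX, conceptually)."""
        def __init__(self, store):
            self.store = store

        def handle(self, path):
            return self.store.read(path)

    class NFSHandler:
        """Serves UNIX clients (NFS over TCP/IP, conceptually)."""
        def __init__(self, store):
            self.store = store

        def handle(self, path):
            return self.store.read(path)

    store = FileStore()
    store.write("/shared/report.txt", b"the same bytes for every client")
    for handler in (NCPHandler(store), NFSHandler(store)):
        print(handler.handle("/shared/report.txt"))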

To me, this law is a very common sense approach to systems design and
allows for flexibility in implementations and usage. This makes
complete sense.

More Open Source foolishness

It is movements like this one that will prove to be the downfall of
Open Source.  This article references a blog post where an author
is urging the Open Source movement to become an “anti-Windows”
movement.  This is absolute foolishness, IMHO.

When will people learn the basics of success?  It is
about creating something, and making it something that is so good that
people adopt it for its own value.  The Open Source community
seems to have a large split … those who are truly pursuing “open”
software, and those who are motivated by a desire to “fight
Microsoft.”  The sad thing is that the “fight Microsoft” crowd are
doing more damage to Open Source than they could ever imagine.

An Open Source strategy that involves applications being
written to support only a particular operating system, Linux, in order
to “force” people to migrate to that operating system, will only turn
the Open Source movement into the next “lock-in” for potential
customers.  This is always a failed strategy.  If Linux is
going to succeed and be adopted, it will be due to the fact that it
delivers real value.  If people are forced to Linux to use Open
Source, then both will be impacted in a negative way.  Humans like
the opportunity to choose value … not be forced into a particular
choice.

On top of that, most advanced strategists and scientists fully
understand that operating systems are becoming a commodity with little
differentiation.  It is only the Linux groupies and extremists
that want to force the whole world to Linux.  If you look around,
there are numerous other kernels that exist – Darwin (Apple), SkyOS,
Solaris, etc. – that have nothing to do with Linux.  As an Open
Source developer, why would you want to tie your application to a
specific kernel and its future?  Don’t most developers realize
there will be other kernels in the future?  It seems that much of
the world has seen the advantages of virtual machines and
cross-platform development languages.  Is Linux really so weak
that it can only succeed if people are forced to go to it?  I don’t think so.

Instead of turning Open Source into Anti-Microsoft Source, embrace the
real notion of Open Source and write to be completely abstracted from
the many kernels of the world.  The strongest solution will
survive and flourish!

Open Source on Windows – Boon or Bane for Linux? [Slashdot:]

Couple Linux with ZigBee

Very nice … Open Source drivers for ZigBee!
This is going to propel the standard forward even further.  I
believe that the recent adoption of ZigBee by even the local companies
here in Utah – Control4 and MaxStream – coupled with projects like this
is going to generate a lot of momentum.

The Linux Wireless Sensor LAN Project 0.1. 802.15.4 Linux drivers and utilities. [freshmeat.net]