I just installed the tool MasterPing that I found through the Technorati site. It is designed to send the blog update “pings” to various sites on the Internet. It’s cool to find tools like this for Radio … I just wish more people would buy a copy of Radio so that the development could progress faster. It really is a cool tool …
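For anyone curious what a ping tool like this actually does on the wire, here is a rough sketch of the standard weblogUpdates.ping XML-RPC call that these tools make. I haven’t looked at MasterPing’s internals, and the endpoint, blog name, and URL below are just placeholders:

```python
# A minimal sketch of a blog update "ping": a call to the standard
# weblogUpdates.ping XML-RPC method on a ping server. Endpoint and blog
# details are placeholders, not anything MasterPing actually uses.
import xmlrpc.client

PING_ENDPOINT = "http://rpc.weblogs.com/RPC2"   # placeholder ping server
BLOG_NAME = "My Radio Weblog"                   # placeholder
BLOG_URL = "http://radio.example.com/myblog/"   # placeholder

def send_update_ping():
    server = xmlrpc.client.ServerProxy(PING_ENDPOINT)
    # weblogUpdates.ping(weblogName, weblogUrl) typically returns a struct
    # with 'flerror' (boolean) and 'message' (string) members.
    result = server.weblogUpdates.ping(BLOG_NAME, BLOG_URL)
    if result.get("flerror"):
        print("Ping failed:", result.get("message"))
    else:
        print("Ping accepted:", result.get("message"))

if __name__ == "__main__":
    send_update_ping()
```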
Kim’s Fifth Law … common sense to many of us!
Kim Cameron posted his Fifth Law of Identity, and I was surprised that more people didn’t just jump in and agree. I was really surprised that Craig Burton didn’t jump for joy as the entire law parallels some of the work that Craig led at Novell years ago.
Kim’s new Law is as follows:
The Law of Pluralism:
A universal identity system MUST channel and enable the interworking of multiple identity technologies run by multiple identity providers.
This reminds me of the original work at Novell on Open Protocol Technology – OPT – when we began to support multiple application protocols for file system access.
As a brief history, NetWare was a “next generation” kernel and operating system when it was introduced to the market. For a transport protocol, it used a variation of the Xerox XNS protocols that Novell renamed as IPX, SPX, RIP, SAP, and others. On top of this transport (the equivalent of TCP/IP in the Internet) was the application protocol for making file system requests – the NetWare Core Protocol, or NCP. To simplify this, NCP can be thought of as similar to NFS … a file access protocol. So where UNIX systems would use NFS on a transport of TCP/IP, NetWare servers would be accessed from DOS workstations using NCP on a transport of IPX.
The first step towards Open Protocol Technology – or a form of Pluralism – was with Novell NetWare v2 (actually it was version 2.15 in 1988!) when Novell added support for the AppleTalk protocol suite, allowing Apple Macintosh computers to see a NetWare server as though it were an Apple server. This was done by adding support for the Apple transport protocols, and also the file protocols. So now DOS and Windows workstations could access files on the server using NCP/IPX, and Macintosh computers accessed the same files … using their native tongue, the Apple Filing Protocol (AFP).
Soon after this, Novell added support for TCP/IP, NFS, and FTP with the release of NetWare v3. Novell actually went even further and implemented the OSI protocol stack on NetWare. I still have a sealed box of NetWare FTAM, the product where Novell implemented the FTAM file protocols on top of an OSI protocol stack!
In this example of “pluralism” Novell was able to create a product that supported file system access via numerous transport protocols and numerous file access protocols. We had demonstration networks showing machines running DOS or Windows, along with Macintoshes and UNIX machines, all sharing files on the NetWare server. This was in 1989 through 1991!
If we fast forward to now, this is a common feature of almost any operating system! Even the Linux systems in use today can mirror this type of functionality with multiple transport protocol support and projects like Samba, Netatalk, etc.
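Just to make the “pluralism” pattern concrete, here is a toy sketch of the idea: one shared file store with several protocol front ends layered on top. This is my own illustration, not NetWare’s or Samba’s actual design, and all of the names in it are made up:

```python
# A conceptual sketch of the "pluralism" pattern: one storage back end,
# several protocol front ends that all translate into the same core operations.

class FileStore:
    """The single back end that every protocol handler talks to."""
    def __init__(self):
        self.files = {}

    def read(self, path):
        return self.files.get(path, b"")

    def write(self, path, data):
        self.files[path] = data

class ProtocolFrontEnd:
    """Base class; each wire protocol would subclass this."""
    name = "generic"

    def __init__(self, store):
        self.store = store

    def handle_read(self, path):
        print(f"[{self.name}] read {path}")
        return self.store.read(path)

class NCPFrontEnd(ProtocolFrontEnd):
    name = "NCP over IPX"

class AFPFrontEnd(ProtocolFrontEnd):
    name = "AFP over AppleTalk"

class NFSFrontEnd(ProtocolFrontEnd):
    name = "NFS over TCP/IP"

store = FileStore()
store.write("/reports/q1.txt", b"shared data")
for front_end in (NCPFrontEnd(store), AFPFrontEnd(store), NFSFrontEnd(store)):
    front_end.handle_read("/reports/q1.txt")   # same file, different "native tongue"
```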
To me, this law is a very common sense approach to systems design and
allows for flexibility in implementations and usage. This makes
complete sense.
More Open Source foolishness
It is movements like this one that will prove to be the downfall of
Open Source. This article references a blog post where an author
is promoting the Open Source movement to become an “anti-Windows”
movement. This is absolute foolishness, IMHO.
When is it that people will learn the basics of success? It is
about creating something, and making it something that is so good that
people adopt it for its own value. The Open Source community
seems to have a large split … those who are truly pursuing “open”
software, and those who are motivated by a desire to “fight
Microsoft.” The sad thing is that the “fight Microsoft” crowd is doing more damage to Open Source than they could ever imagine.
An Open Source strategy that involves applications being written to support only a particular operating system, Linux, in order to “force” people to migrate to that operating system will only turn the Open Source movement into the next “lock-in” for potential customers. This is always a failed strategy. If Linux is going to succeed and be adopted, it will be because it delivers real value. If people are forced to Linux to use Open Source, then both will be impacted in a negative way. Humans like the opportunity to choose value … not to be forced into a particular choice.
On top of that, most advanced strategists and scientists fully
understand that operating systems are becoming a commodity with little
differentiation. It is only the Linux groupies and extremists
that want to force the whole world to Linux. If you look around,
there are numerous other kernels that exist – Darwin (Apple), SkyOS,
Solaris, etc. – that have nothing to do with Linux. As an Open
Source developer, why would you want to tie your application to a
specific kernel and its future? Don’t most developers realize
there will be other kernels in the future? It seems that much of the world has seen the advantages of virtual machines and cross-platform development languages. Is Linux really so weak that it can only succeed if people are forced to go to it? I don’t think so.
Instead of turning Open Source into Anti-Microsoft Source, embrace the
real notion of Open Source and write to be completely abstracted from
the many kernels of the world. The strongest solution will
survive and flourish!
Open Source on Windows – Boon or Bane for Linux? [Slashdot:]
Couple Linux with Zigbee
Very nice … Open Source drivers for ZigBee?
This is going to even further propel the standard forward. I believe that the recent adoption of ZigBee by even the local companies here in Utah – Control4 and MaxStream – coupled with projects like this is going to generate a lot of momentum.
The Linux Wireless Sensor LAN Project 0.1. 802.15.4 Linux drivers and utilities. [freshmeat.net]
Directory technologies and Identity Management
I saw that Mark Wahl and Kim Cameron were talking a while back, and I like to see that. With Mark’s background in LDAP directory technologies, I know that he has been thinking about this space for a long time.
When working on digitalMe at Novell, I really wanted to see directory technologies extended to become a primary platform for Identity. I say “extended” because there were a lot of issues that we found when attempting to store identity in a directory.
There are numerous reasons that a directory is a logical place to store identity:
- fully extensible schema for objects and attributes
- authentication for verifying the user
- access control down to the attribute level
- flexible multi-protocol access
An extensible schema gave us the ability to quickly create a core identity representation: a “user” object with a list of attributes. What was powerful here was when a user interacted with a new entity and was prompted for some previously “unknown” attribute. This would be some attribute that might be common, but had not been pre-defined in our list of user attributes. With a directory we were able to ask the user for the value of that attribute, extend the schema, and populate the value. In a later iteration, we also looked for a way to allow the user to “alias” the new attribute to simply point at an existing attribute. This is the case where some attribute is called different things by different communities.
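To make the “extend or alias on first use” idea concrete, here is a toy sketch of the pattern. This is not the digitalMe code, and all of the names and data are made up:

```python
# A toy illustration of the pattern: when a relying party asks for an
# attribute the schema doesn't have, either alias the new name to an
# existing attribute or extend the schema and capture a value.

class IdentityStore:
    def __init__(self):
        self.schema = {"givenName", "surname", "mailAddress"}
        self.aliases = {}          # e.g. "email" -> "mailAddress"
        self.values = {"givenName": "Scott", "surname": "Lemon",
                       "mailAddress": "scott@example.com"}   # placeholder data

    def resolve(self, attr):
        """Follow an alias, if one exists, to the canonical attribute name."""
        return self.aliases.get(attr, attr)

    def get(self, attr, prompt=input):
        attr = self.resolve(attr)
        if attr not in self.schema:
            # Previously "unknown" attribute: ask the user what to do.
            existing = prompt(f"'{attr}' is new. Alias it to one of "
                              f"({', '.join(sorted(self.schema))}) or leave blank: ")
            if existing in self.schema:
                self.aliases[attr] = existing    # alias to an existing attribute
                attr = existing
            else:
                self.schema.add(attr)            # extend the schema
                self.values[attr] = prompt(f"Value for '{attr}': ")
        return self.values.get(attr)

# Example: a relying party asks for "email", which we alias to mailAddress.
store = IdentityStore()
answers = iter(["mailAddress"])
print(store.get("email", prompt=lambda msg: next(answers)))   # scott@example.com
```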
With directory authentication, we are able to verify who is talking to the directory and then enforce access control on that connection.
Access control, down to the attribute level, was one of the most powerful features that a directory provides. With this, we were able to determine the “visibility” of any particular identity attribute to any requestor. With most directories there is even the concept of a “public” or “anonymous” user making requests, and so we were able to expose those attributes that are considered “public.” This is also what allows me to expose more information to the people I choose. These access controls could also determine who was able to modify any attribute of any object. So, for example, I might have an object that represents you in my directory, and I might choose to allow you to update and maintain some of your own attributes. It is important to see that I might choose to … because I also might choose not to allow you to update your object. After all … it’s my directory. However, if I trust you, why not allow you to keep me up to date on your identity if you want to?
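Here is a small sketch of what attribute-level visibility looks like in practice. Real directories enforce this in the server itself; this is only my own illustration of the idea, with made-up names and data:

```python
# A sketch of attribute-level access control in the spirit of directory ACLs.

PUBLIC = "public"

class DirectoryObject:
    def __init__(self, attributes, acl):
        self.attributes = attributes   # attribute name -> value
        self.acl = acl                 # attribute name -> {"read": set, "write": set}

    def read(self, attr, requestor=PUBLIC):
        readers = self.acl.get(attr, {}).get("read", set())
        if requestor in readers or PUBLIC in readers:
            return self.attributes.get(attr)
        raise PermissionError(f"{requestor} may not read {attr}")

    def write(self, attr, value, requestor):
        writers = self.acl.get(attr, {}).get("write", set())
        if requestor not in writers:
            raise PermissionError(f"{requestor} may not write {attr}")
        self.attributes[attr] = value

you = DirectoryObject(
    attributes={"fullName": "Jane Doe", "homePhone": "555-0100"},
    acl={"fullName": {"read": {PUBLIC}, "write": {"jane"}},
         "homePhone": {"read": {"scott", "jane"}, "write": {"jane"}}},
)

print(you.read("fullName"))                  # anonymous/"public" read succeeds
print(you.read("homePhone", "scott"))        # visible only to the people I choose
you.write("homePhone", "555-0199", "jane")   # I let you maintain your own attribute
```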
Lastly, it is multi-protocol access that offered the ability to integrate with a wide range of identity solutions. At Novell we had internal proprietary protocols, LDAP, and even some HTTP/HTML/XML methods of access. I worked on a protocol that I called XDAP just before we announced digitalMe. It is almost a LID/FOAF parallel. What I did was to have XML data returned – in DSML format – when a request was received in the IETF RFC 2255 format. Even after leaving Novell I had a lot of fun experimenting with this further, using CSS and XSL to directly render identity information as “documents” in the browser that looked just like the “real” documents in the paper world.
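To give a feel for the general shape of that kind of round trip, here is a rough sketch: an RFC 2255-style LDAP URL coming in, and a DSML-style XML document going out. The URL, DN, data, and element names are my own placeholders from memory, not the actual XDAP or digitalMe formats:

```python
# Sketch: parse an RFC 2255-style LDAP URL and answer with a DSML-like
# XML document. Element names follow DSMLv1 as best I remember them;
# treat them as approximate.
from urllib.parse import urlparse, unquote
import xml.etree.ElementTree as ET

def parse_ldap_url(url):
    """Split ldap://host/dn?attributes?scope?filter into its pieces."""
    parsed = urlparse(url)
    dn = unquote(parsed.path.lstrip("/"))
    attrs, _, rest = parsed.query.partition("?")
    scope, _, ldap_filter = rest.partition("?")
    return dn, [a for a in attrs.split(",") if a], scope or "base", ldap_filter

def to_dsml(dn, entry, requested):
    NS = "http://www.dsml.org/DSML"
    root = ET.Element(f"{{{NS}}}dsml")
    entries = ET.SubElement(root, f"{{{NS}}}directory-entries")
    node = ET.SubElement(entries, f"{{{NS}}}entry", dn=dn)
    for name in requested:
        if name in entry:
            attr = ET.SubElement(node, f"{{{NS}}}attr", name=name)
            ET.SubElement(attr, f"{{{NS}}}value").text = entry[name]
    return ET.tostring(root, encoding="unicode")

identity = {"cn": "Scott Lemon", "mail": "scott@example.com"}   # placeholder data
dn, attrs, scope, flt = parse_ldap_url(
    "ldap://id.example.com/cn=Scott%20Lemon,o=example?cn,mail?base?(objectClass=person)")
print(to_dsml(dn, identity, attrs))
```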
Overall … I believe that directories could be one of the possible stores for identity information. There are, however, some limitations in their implementations that don’t allow for many of the common identity request patterns … like versioning and timestamping of attributes. Directories are not very well designed to account for how our identities evolve and change over time. I believe this is necessary for effective identity management.
Representation of observable identity
I really like this post by Kim about Carl’s notions of identity. I agree with this and like the overall direction; however, I believe that there is a big thing missing.
In my opinion, there has to be a larger circle that encompasses this
entire diagram that represents the community that all three of these
entities belong to!
What community? Well … there has to be some community that exists, since there is some common language, or form of representation, that each of these entities is using to refer to the observations. In fact, I do not believe that distinctions of identity attributes can exist outside of the context of some language. Words to describe the uniqueness of the attribute … to measure and quantify it.
I was reminded of this tonight while talking with my two and a half year old son, Sam. He has long known the meme “ball,” which started with watching the kids playing basketball outside. He then assumed that all balls are a “b-ball” … what he would call them. One evening while driving home, he saw the moon in the sky and yelled “Daddy … ball!” pointing at the bright disc. I told him “moon” and he replied “moon ball.” The birth of a new distinction. Tonight he saw someone on TV playing with a globe that they took off of its stand. He said “Ball!” and I replied “globe” … he then replied “globe ball.”
As we are both making these observations and distinctions, we are only
able to refer to the identity of these objects through a common
language … and common community. This entire ability is an
accumulated experience that we all seem to forget … it is a product
of the community that we grow up and live within.
There ought to be a big circle behind the entire image … and it will represent the community through which these three entities have come into contact … and the one that gives them the ability to express their observations.
Ski Utah!
Today I was able to get out skiing at Park City Ski Resort with my cousin Brian. It was a great day, and the snow and weather just worked. On top of that, I got to catch up with my cousin and hear more about what he has been doing. Brian and his wife have an architectural firm called Dake Wilson Architects, and they are building some amazing homes and buildings. They came out for the Sundance Film Festival.
While they were here, I learned that Brian not only did the designs for
the Puma Stores, but he also worked on Bill Joy’s home in Aspen.
It’s also fun to search Google for relatives! I found that Renee
also wrote a brief article about their Eco-Home in LA … they are great people and it was good to see them.
Internet has room to grow …
This is a good article that talks about the next generation of speed tests for transferring information. The article from Computerworld shows that the research into DWDM and other types of modulation is still progressing.
Wow! That’s fast TCP!
Data has been sent across a wide-area optical network at 101Gbit/sec.,
the fastest-ever sustained data transmission speed, equivalent to
downloading three full DVD movies per second, or transmitting all of
the content of the Library of
Congress in 1… [KurzweilAI.net Accelerating Intelligence News]
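As a quick back-of-the-envelope check of the “three full DVD movies per second” claim, assuming a 4.7 GB single-layer DVD (round numbers of my own, not from the article):

```python
# Rough arithmetic: how many single-layer DVDs fit through a 101 Gbit/sec link
# each second?
link_rate_bits_per_sec = 101e9             # 101 Gbit/sec
bytes_per_sec = link_rate_bits_per_sec / 8
dvd_bytes = 4.7e9                          # single-layer DVD capacity (assumed)

print(f"{bytes_per_sec / 1e9:.1f} GB/sec")           # ~12.6 GB/sec
print(f"{bytes_per_sec / dvd_bytes:.1f} DVDs/sec")   # ~2.7 DVDs/sec
```

That works out to roughly 2.7 discs per second, which is in line with the article’s “three full DVD movies per second.”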
Is there such a thing as ‘public’ and ‘private’?
I want to start off by saying that I am in agreement with Kim’s Fourth Law of Identity … however it did get me thinking about ‘public’ and ‘private’ … ‘omnidirectional’ and ‘unidirectional’ …
The Fourth Law of Identity
The Law of Directed Identity
A universal identity system MUST support both “omnidirectional”
identifiers for use by public entities and “unidirectional” identifiers
for use by private entities, thus facilitating discovery while
preventing unnecessary release of correlation handles.
First, when I think about identity, I now believe that a ‘public’
identity is really just a ‘default’ identity. This is what we are
willing to expose to anyone, anyplace, and at any time. If I look
at the ‘real world’, we have certain characteristics and behaviors that
we are willing to expose when we go out in public. We then might
meet up with someone else, and choose to exchange other information
‘privately’, however we actually reveal something about ourselves even
when we perform a ‘private’ exchange of information. Kim stated:
Entities that are public can have identifiers that are invariant and well-known. These identifiers can be thought of as beacons, emitting identity to anyone who shows up – and thus being in essence “omnidirectional” (they are willing to reveal their existence to the set of all other identities).
I agree with this … for any provider of goods or services to be known, they must expose some sort of information to be discovered. It could be that the entity might choose to ’emit this beacon’ all of the time … or maybe to sit quietly waiting for the detection of another entity. In either case, once the ‘omnidirectional beacon’ has been emitted, there is a way to reference the source entity.
What I like is the second example:
A second example of such a public entity is the “polycomm”
which looms large in the scenario we chose as a backdrop to the present
discussion. The polycomm sits in a conference room in an enterprise.
Visitors to the conference room can see the polycomm and it offers
digital services by advertising itself to those who come near it. In
the thinking outlined here, it has an omni-directional identity.
This is really no big deal … it makes common sense … however:
Similarly, when entering a conference room furnished with
a polycomm, the omnidirectional identity beacon of that polycomm can be
used by the owner of a cell phone to decide whether she wants to
interact with it. If she does, a short-lived “unidirectional” identity
relation can be created between the cell phone and the polycomm – and
used to disclose a single music preference without associating that
preference with any long-lived identity whatsoever.
I’m not so sure that this is truly ‘unidirectional’ since
there are other artifacts of the ‘short-lived unidirectional identity
relation’ that could be observed. I might not be able to
determine the exact details of what is transferred, however I could
easily – with the assistance of some others in the room – triangulate
on the source of the signal and locate the owner of the cell
phone. I could then couple this with other visible or audible
information to begin the process of compiling a profile of that
person. So is this ‘private’?
Of course the owner of the cell phone could also collaborate with others in the room to all initiate communications with the polycomm at the same time, and the polycomm could be configured to add random timings to assist with masking the true source of the music preference. However, this still potentially identifies the ‘crowd’ or ‘community’ that is the source of the communications.
When I was working on digitalMe, I followed the work of the AT&T “Crowds” project … and also the Lucent Personal Web Assistant project.
Both of these convinced me that there might not really be a way to be
truly “private” … and that the best we can hope for is to hide in a
crowd.
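Here is a toy simulation of the “hide in a crowd” idea … not the Crowds or PWA protocols themselves, just my own sketch of why randomized cover traffic makes source attribution ambiguous:

```python
# Toy illustration: several phones transmit at randomized times, only one of
# them carrying the real preference, so an observer who can only see *that* a
# phone transmitted (and when) learns little about who supplied the preference.
import random

def simulate(phones, real_sender, max_delay_ms=500):
    transmissions = []
    for phone in phones:
        delay = random.uniform(0, max_delay_ms)   # randomized timing
        payload = "music-preference" if phone == real_sender else "cover-traffic"
        transmissions.append((delay, phone, payload))
    # What an outside observer sees: who transmitted and when, not the payload.
    return sorted((delay, phone) for delay, phone, _ in transmissions)

phones = ["phone-A", "phone-B", "phone-C", "phone-D"]
print(simulate(phones, real_sender=random.choice(phones)))
```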
Great … Radio seems to be working again …
Well … after that last series of posts … it seems that Radio is behaving again …