It’s fun to see the attention that Microsoft is getting lately … all
based on the rumors of the coming Identity solution. I saw this
reference tonight on the CNN web site.
I’m hoping to see some of the developer SDK stuff soon …
Tonight is a news-reading and e-mail-reading evening. I’m way
behind on my reading and responding. I’ve been way too busy with
a new job, and I’ve been on the road. At the beginning of this
week, however, I was in a programming class and I learned C#. I’m
now moving all of my development to this new cross-platform language.
All of what I learned this week was in Microsoft Visual Studio. I
cannot say enough about how impressed I am with the complete Microsoft
development environment. The creators of this development solution
ought to be proud of what they have created.
I am also downloading and installing all of the latest Mono tools to
begin the process of developing C# on Linux. I am looking
forward to tracking the progress of the Mono project, and all of the
various components. What I really like is that C# and the support
behind it appears to be a new language – and complete application
deployment platform – that will deliver where Java seemed to
stumble. C# is now being actively and completely supported on the
two biggest platforms on earth – Windows as the largest installed base
of machines, and Linux as the rapidly growing contender. No JVM
to download and install … no strange looking User Interface.
Anyhow … slightly off-topic … but I wanted to comment on
this. I have to admit that I see C# as a big deal in the next
decade!
When
I was reading my aggregator the last day of eTech, I found these
posts in my page of new articles. I started to wonder “How the
heck is my aggregator
going crazy? What is going on here? I’m not doing this!”
… and then I realized what was up. At eTech, all of the
attendees were on the wireless network behind a NAT. To Slashdot,
it must have looked like a lot of requests for their RSS feed from the
same address. Slashdot thought this was all traffic coming from a
single user … and so they started sending back error messages instead of the feed.
It’s funny to see yet another way in which technology confuses
technology. I’m not sure how this was solved … someone must
have contacted Slashdot to let them know. Slashdot only
saw the one “identity” and assumed that it was a single user hammering
their servers. Yet another case where some sort of solution could
be developed to encode identity into the RSS request.
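The Slashdot situation above can be sketched in a few lines. This is a hypothetical illustration, not Slashdot's actual throttling code: a simple counter-based limit keyed by source IP lumps every attendee behind the NAT into one bucket, while keying by an identity token carried in the request (say, an HTTP header) would keep each reader under the limit individually.

```python
from collections import defaultdict

class FeedThrottle:
    """Per-key request throttle, as a feed server might apply."""
    def __init__(self, limit):
        self.limit = limit
        self.counts = defaultdict(int)

    def allow(self, key):
        self.counts[key] += 1
        return self.counts[key] <= self.limit

# Throttling by source IP: 200 attendees behind one NAT share a key,
# so the server sees "one user" hammering the feed and refuses most requests.
by_ip = FeedThrottle(limit=10)
nat_results = [by_ip.allow("192.0.2.1") for _ in range(200)]

# Throttling by an identity token in the request keeps each of the
# 200 readers well under the limit individually.
by_identity = FeedThrottle(limit=10)
id_results = [by_identity.allow(f"reader-{n}") for n in range(200)]

print(sum(nat_results), sum(id_results))  # 10 200
```

Only ten of the NAT requests get through, while all two hundred identified requests do, which is exactly the difference an identity encoded into the RSS request would make.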
Funny …
Cool. On Monday I’m heading down to the O’Reilly Emerging Technology Conference. I am really looking forward to this. Not only is there an awesome lineup of speakers, there are going to be a lot of very cool people to talk with, and brainstorm with.
I’ll be blogging the conference as I figure they will have wireless everywhere!
While reading Ken Novak‘s weblog, I found his post about SkypeCasting.
I love it! This is a cool idea … and continues to make me think
about the future that we are quickly approaching.
I once heard a good quote that was something like “Privacy in the
future will be the equivalent of living in a nudist colony.
People who are uncomfortable being naked will be very uncomfortable in
the future.” The gist of this statement is that we are quickly
approaching the “Transparent Society”
that David Brin explored in his book. In this possible future,
there will be little that we can do about having our every move
observed, recorded, and/or reported on. So what does this have to
do with SkypeCasting?
With wireless Internet everywhere, smaller and smaller laptops and
computers, and software like Skype providing VoIP capabilities, the
ability to “bug” almost any event or conversation increases. Add
video to this, and our ability to remotely observe and listen to almost
anything is extended. What got me thinking about this was this
comment:
“I’d happily pay $5 to hear the music from my favorite jazz club when I can’t
make it; and I’d like to listen in on community or political meetings when I
can’t be there”
Paying for such a live feed is a reasonable thing to think about.
The real issue will be that anyone in the audience of any event can
become a *free* live feed of that event. In addition, anyone
walking around that is near you can become a live feed of you and your
activities. This will create an interesting way of verifying
your identity and reputation … in near real time.
I know that these are some older documents, however I had not seen these docs on Microsoft Identity and Access Management
before. I found them linked from another web site I was
reading. There is some interesting stuff. From their site:
This series of papers provides numerous identity and access management
concepts, techniques, and solutions for use in heterogeneous IT
environments.
Identity and access management combines processes, technologies, and
policies to manage digital identities and specify how they are used to access
resources.
I’m reading through the package for some background.
Funny what you find on the net! While reading through some links related to wearable computer research I came across this great page with some thoughts by Ana Viseu
about “bodynets” and Identity. Besides the fact that I really
like the look of the web site, I like this train of thought:
Identity, loosely defined as the way we see and present ourselves, is
not static. On the contrary, identity is primarily established in social
interaction. This interaction consists, in its most basic form, of an
exchange of information. In this information exchange individuals define
the images of themselves and of others. This interaction can be mediated (through
a technology, for example) and it can involve entities of all sorts,
e.g., an institution or a technology. I am investigating this interaction
through the study of bodynets.
Bodynets can be thought of as new bridges or interfaces between the
individual and the environment. My working definition of a bodynet
is: A body networked for (potentially) continuous communication with
the environment (humans or computers) through at least one wearable
device: a computer worn on the body that is always on, ready and accessible.
This working definition excludes implants, genetic alterations, dedicated
devices and all other devices that are portable but not wearable, such
as cell phones, smart cards or PDAs.
Besides the matters related to identity, bodynets also raise serious
issues concerning privacy, which in turn feedback on identity changes.
Bodynets are composed of digital technologies, which inherently possess
tracking capabilities; this has major privacy implications.
If you like this, continue reading … there is a lot of additional material. Whenever I see the University of Toronto, I have to guess that Steve Mann is involved. These are all important directions to look at.
I can completely understand the natural human tendency to want to claim
ownership of our identity. I constantly see the statements “It’s mine! I want to control it! I want to determine who can see it!”
After years of looking at this space, however, I have become convinced
that our identity, as we know it, is already spread across the
communities that gave us that identity. That is the basis of the First and Second Axioms that I posited. I see identity as an accumulated thing … and it only exists with language. It is language that allows us to distinguish identity.
It is this train of thought that takes me to the Fourth Axiom of Identity:
I posit that for an
effective community to exist there must be verified agreement, which
requires a minimum of three community members.
As I stated above, I believe it is language that allows us to
distinguish identity. Language requires agreement on the meaning
of words. The only way to have verified agreement is to have a
third party as a “tiebreaker.” This does not prevent
disagreement, however it allows for the verification of a previous
agreement.
Where I see this relating to identity is in the area of verification, or authentication
of identity attributes. Here, I mean authentication as in
“verifying the authenticity of.” I can always choose to accept
the identity information that I receive from another entity, however
the value and accuracy of that identity information is always in
question until I am able to verify it. How I verify identity
information will always occur using some other member of a common
community that is able to provide the verification.
When applying for a financial loan, I can tell a bank about my employer
and salary information, however they are probably going to verify the
information with my employer. All three of us exist within a
common community that enables this, and this community has defined the
language and protocols to enable this verification process to
occur. There is the ability to generate a verified agreement.
Even in a conversation with my employer, we could potentially disagree
about my salary. So what do we do? We get lawyers involved
– the third party – to examine the contracts and employment agreement
to verify the information. This can only occur within a community
that has the same language and contract law. Verified agreement.
In the digital identity world, this quickly becomes an important aspect
of any system that is going to replace our paper-based
systems. Often, just as in paper-based systems, it is not enough
to communicate only the identity information. It is often
critical to also communicate the sources of verification … some third
party that can provide verified agreement. This, in most cases,
is also the source of that identity attribute.
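The employer-and-bank flow above can be sketched in code. This is a minimal illustration under stated assumptions, not a real protocol: the source of the attribute (the employer) signs the claim it asserts, and the relying party (the bank) verifies the claim against that source. An HMAC over a shared key stands in for a real signature scheme, where the verifier would hold the issuer's public key; all names and values are hypothetical.

```python
import hashlib
import hmac
import json

# A shared secret stands in for real signing keys in this sketch.
EMPLOYER_KEY = b"hypothetical-employer-key"

def issue_claim(attributes, key):
    """The source of an attribute (the employer) signs what it asserts."""
    payload = json.dumps(attributes, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": payload, "issuer": "employer", "tag": tag}

def verify_claim(claim, key):
    """The relying party (the bank) checks the claim against its source."""
    expected = hmac.new(key, claim["payload"], hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, claim["tag"])

claim = issue_claim({"name": "alice", "salary": 50000}, EMPLOYER_KEY)
print(verify_claim(claim, EMPLOYER_KEY))      # True: verified agreement
tampered = dict(claim, payload=b'{"salary": 90000}')
print(verify_claim(tampered, EMPLOYER_KEY))   # False: self-asserted change fails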
I wish that I had more hours in the day. I have been wanting to respond to an e-mail from Johannes Ernst (I swear I will! I’m reading the LID docs again!) for weeks now … and I also wanted to reply to this post that he wrote the other day.
In his post, he comments on some of the comments that I made about
directories, and I wanted to clarify a couple of points. He lists
three issues that I will address here:
I am in full agreement, and my directory solution is also fully
decentralized. Anyone that knew me at Novell during our years of
work on digitalMe knows that I was a maniac about a project out of our
labs in India called “Personal Directory.” You can still go and download a copy
and check it out. This is a full blown LDAP v3 directory service
that can run on your desktop. In my perspective of how
directories can be integrated and used for identity, I do not believe
in “one big directory in the sky”, nor “a bunch of directories”, but
instead see these running everywhere.
As I started to read the LID documentation, I realized that I could
probably put an LDAP directory behind the LID protocols, and serve
information directly from the directory. The benefit here is that
directories like this are already in use in thousands or millions of
businesses out there … so leveraging this existing base of identity
information just happens.
Yes! This was one of the core benefits we were working on with
digitalMe … a way for users to manage their own identity, and also
the synchronization of their attributes – selectively – into other
personal and community directories. The power that we were
exploiting was a standard feature of Novell’s directory implementations
… the ability to easily determine who could access/modify any object
down to the attribute level. We then worked on automating the
process of a local agent keeping your identity information up to date
with the personal and community directories where you had defined a
relationship.
Hopefully, some of my explanation above reveals some of what we were
exploring. With digitalMe, I would have my ‘personal directory’
where I would have an object representing me
to keep my own personal identity information, along with objects
representing friends, family, and associates that I have relationships
with. Corporations or other communities would then have their own
directories containing objects representing the identities of their
members and associates … one of those objects might represent me if I
have a relationship with that entity.
As part of our redundancy and fault tolerance plans, we had also looked
to the future where I might also replicate my directory to other
computers (my home computer?) or hosted directories (a bank?) so that
there is no single point of failure or loss.
One of the areas where I really like LID, especially when thinking about
integration with directories, is the layers of abstraction that can be
implemented. I could easily modify the index.cgi (ok … if I had some spare time!)
so that it uses a directory to obtain the user attributes, instead of
the various vCard and FOAF xml files. If the LID request also
passes through the credentials of the requestor, then the directory
would automatically return only the attributes visible to that
requestor. If I still wanted the foaf.xml or vcard.xml files, I
could generate these dynamically on the fly – from the directory – as
an alternative. In a business environment, there might already be
a directory that contains a great deal of information about me.
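The per-requestor filtering described above can be sketched without a real LDAP server. This is a toy model under stated assumptions, not LID's actual protocol or Novell's implementation: each attribute carries an access list loosely modeling per-attribute directory ACLs, and a lookup returns only what the requestor's credentials permit. All names and groups are hypothetical.

```python
# A toy in-memory "directory" where every attribute carries its own
# visibility list, loosely modeling per-attribute ACLs.
DIRECTORY = {
    "scott": {
        "cn":     {"value": "Scott", "visible_to": {"public", "friends", "employer"}},
        "email":  {"value": "scott@example.com", "visible_to": {"friends", "employer"}},
        "salary": {"value": 50000, "visible_to": {"employer"}},
    }
}

def lookup(subject, requestor_group):
    """Return only the attributes the requestor's credentials permit."""
    entry = DIRECTORY[subject]
    return {name: attr["value"]
            for name, attr in entry.items()
            if requestor_group in attr["visible_to"]}

print(lookup("scott", "public"))    # {'cn': 'Scott'}
print(lookup("scott", "employer"))  # all three attributes
```

A foaf.xml or vcard.xml file could then be generated on the fly from whatever `lookup` returns, so the same directory serves different views to different requestors.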
Overall, I really like what I see with LID … I’m going to continue
reading and maybe play with the scripts. Maybe I’ll make the time
to do some modifications … 😉
Kim Cameron posted his Fifth Law of Identity, and I was surprised that more people didn’t just jump in and agree. I was really surprised that Craig Burton didn’t jump for joy as the entire law parallels some of the work that Craig led at Novell years ago.
Kim’s new Law is as follows:
The Law of Pluralism:
A
universal identity system MUST channel and enable the interworking of
multiple identity technologies run by multiple identity providers.
This
reminds me of the original work at Novell on Open Protocol Technology –
OPT – which was when we began to support multiple application protocols
for file system access.
As a brief history, NetWare was a “next generation” kernel and
operating system when it was introduced to the market. For a
transport protocol, it used a variation of the Xerox XNS protocols that Novell renamed as IPX, SPX, RIP, SAP,
and others. On top of this transport (the equivalent of TCP/IP in
the Internet) was the application protocol for making file system
requests – the NetWare Core Protocol
or NCP. To simplify this, NCP can be thought of as similar to NFS
… a file access protocol. So where UNIX systems would use NFS
on a transport of TCP/IP, NetWare servers would be accessed from DOS
workstations using NCP on a transport of IPX.
The first step towards Open Protocol Technology – or a form of Pluralism – was with Novell NetWare v2 (actually it was version 2.15 in 1988!) when Novell added support for the Apple Talk Protocol Suite,
allowing Apple Macintosh computers to see a NetWare server as though it
were an Apple server. This was done by adding support for the
Apple transport protocols, and also the file protocols. So now
DOS and Windows workstations could access files on the server using
NCP/IPX, and Macintosh computers accessed the same files … using
their native tongue, the Apple File Protocol.
Soon after this, Novell added support for TCP/IP, NFS, and FTP with the
release of NetWare v3. It actually went even further when Novell
implemented the OSI protocol stack on NetWare. I still have a sealed box of NetWare FTAM which was the product where Novell implemented the FTAM file protocols on top of an OSI protocol stack!
In this example of “pluralism” Novell was able to create a product that
supported file system access via numerous transport protocols, and
numerous file access protocols. We had demonstration networks
showing where machines running DOS or Windows, along with
Macintoshes, and UNIX machines, were all sharing files on the
NetWare server. This was in 1989 through 1991!
If we fast forward to now this is a common feature of almost any
operating system! Even the Linux systems in use today have the
ability to mirror this type of functionality with multiple transport
protocol support, and projects like Samba, Netatalk, etc.
To me, this law is a very common sense approach to systems design and
allows for flexibility in implementations and usage. This makes
complete sense.