This is a really good topic to examine. There are numerous trends
emerging in the industry, and these are the next big software
companies … they have read the writing on the wall.
The software platforms of the future are being built as “abstractions”
above the operating systems … and far above the kernels in use.
The world is very quickly becoming one filled with interpreted
languages, scripting languages, and virtual machines. Hardware is
becoming so powerful and so cheap that the compute inefficiencies are
quickly masked, and “good enough” performance emerges. This is
even further demonstrated by the rapid growth in hardware “emulators”
or “virtualizers” like VMware, QEMU, and Virtual PC.
On top of this, we are actually watching the entire computer software
industry converge towards a complete “UNIX-compatible” set of APIs and
development tools. NetWare? Novell is moving to
Linux. Macintosh? It’s now based on Darwin.
Windows? Even they have SFU (Services for UNIX), which supports the
same class of applications. UNIX-compatible software is what the
market is actually embracing … not “Linux” or any specific version of
UNIX.
Most people do not realize that the majority of any “Linux”
distribution is actually a wealth of GNU tools and UNIX-compatible
software. In my research, only ~3% of any distribution is “Linux”
itself … the rest is all of the common libraries, languages, and
applications that we all hear about – Apache, MySQL, Perl, PHP, C#,
Java, GNOME, gcc, the X Window System, etc. – and all of those are
“UNIX-compatible” applications and services or have versions that run
on UNIX-compatible kernels.
If the world is now going to see a mass commoditization of the kernels
… with complete compatibility around a common set of development
tools, then the real play – that SourceLabs is pursuing – becomes the
certification and support of the wealth of Open Source software.
And if I were going to do it, I would ensure that I could provide all
of this software across *ALL* of the UNIX-compatible kernels in
existence … or at least the core four for now – Linux, Darwin,
FreeBSD, and Windows/SFU.
With a strategy to become the de facto provider of software across all
of these platforms, you would be able to provide the solutions to your
customers and not care about kernels. If there are problems with
one … you can move them to a different kernel without issues.
If they are a Windows shop, you get them to adopt SFU and
UNIX-compatible applications as a “migration preparedness”
strategy. If they are a Linux shop … you are ready to move them
if the legal issues take a turn for the worse.
In all cases, you are establishing yourself as the optimal software
development solution … being paid to maintain and enhance the
software that truly hits the users and customers … while further
commoditizing and making irrelevant the kernels and low-level code that
everything runs on.
In addition, if you play this right you are able to take over Open
Source projects and require that the copyrights for all contributions
be assigned to you … allowing you to further control the “fork” of
software that you are driving forward. Yes … people could
attempt to fork a project in a new direction, but it takes a lot of
effort … and if that occurs, you still charge your customers to
support and maintain the new fork … win … win … win.
So when I see this announcement, I have to say that I see this as being
the real win in Open Source and the current market craze.
Companies like IBM are well placed to capitalize on this also … and
you see that even the big IBM does not have a “distribution” of Linux
they sell … they are moving on beyond the lower layers, and up to the
applications. Companies like this are going to be well positioned
to allow their customers to take advantage of the newest kernels …
and move away safely from those that cannot survive … or to ones
that are cheaper or free …
The abstraction of the operating system is well under way … and this
is the birth of a new business that is doing to the operating system
what operating systems did to the processor.