Boiling a Google …

I had to write one other follow-up post about Google and twitter.

When I read articles like this one “Google CEO: Twitter A ‘Poor Man’s Email System’ (GOOG)” where Eric Schmidt is quoted as saying:

“Speaking as a computer scientist, I view all of these as sort of poor man’s email systems,”

… I am reminded of the story about boiling a frog.  If you throw a frog into boiling water, it’ll try to leap out immediately.  If you put a frog in a pot of lukewarm water, and then turn the heat on high … the frog will enjoy the nice warm swim, and eventually be boiled to death as the temperature increases.

The comment that Eric made, in my opinion, demonstrates that the Google-frog is swimming around in heating water … oblivious to what is going on around it when it comes to twitter.

In a follow-up article “Google’s Schmidt: I Didn’t Diss Twitter (GOOG)” I believe that Eric continues to miss the boat on twitter:

IN CONTEXT IF YOU READ WHAT I SAID, I WAS TALKING ABOUT THE FACT THAT COMMUNICATION SYSTEMS ARE NOT GOING TO BE SEPARATE. THEY’RE ALL GOING TO BECOME INTERMIXED IN VARIOUS WAYS. PEOPLE WILL USE EMAIL, THEY WILL WANT TO USE TWITTER FACEBOOK,  THEY WILL WANT TO USE THE OTHER FORMS.

Uh … I would argue that it’s not about mixed forms of communications.  It’s all about search, and reputation-based search and endorsements.

I’d actually love to hear Eric talk about the power of twitter, and what they are doing where Google has failed.  How many people remember Dodgeball?  An incredible text service that started before twitter … had lots of potential … they were bought by Google and crushed internally.  FAIL.

Google … time to get out of the water.  But then again … it might actually be too late.

twitter – combining search … and reputation networks!

When I read this article by Michael Arrington today I had to come out of blog hibernation! He hits it right on target … twitter is beginning to benefit from emergent properties of the mass of communications passing through its network. I am a firm believer that anytime you get massive amounts of interaction, there will be completely unanticipated emergent properties … and twitter is reaching that inflection point. It’s no longer about twitter … but what is now beginning to appear within twitter … and what can now be gleaned from the masses of tweets.  Michael explains the real value that is beginning to emerge … ratings and rankings of people and products … via twitter … in near real-time.  All updated from your mobile device wherever you are.

At a dinner tonight with a friend the conversation turned to Twitter. He just didn’t get it, and he’s certainly not the first person to tell me that. Specifically, my friend didn’t understand the massive valuation ($250 million or more) that Twitter won in its recent funding. I told him why I thought it was more than justified: Twitter is, more than anything, a search engine.

I told him what I thought of Twitter as a micro-blogging service: it’s a collection of emotional grunts. But it’s wonderful nonetheless. And enough people are hooked on it that Twitter has reached critical mass. If something big is going on in the world, you can get information about it from Twitter.

Twitter also gathers other information, like people’s experiences with products and services as they interact with them. A couple of months ago, for example, I was stuck in the airport and received extremely poor service from Lufthansa. I twittered my displeasure, which made me feel better – at least I was doing something besides wait in an endless line. I’ve also Twittered complaints about the W Hotel (no Internet, cold room) and Comcast (the usual Internet gripes).

Last week I read a great piece from John Borthwick called “Google Next Victim of Creative Destruction? (GOOG)” and I couldn’t agree with him more. Although Google currently rules the search world, I would argue that it is only one kind of search, and that although their page-rank has taken them a long way … there are too many ways to game the system. Google holds control of the “authority” that determines which pages THEY feel should appear on the results page. I spoke the other day with a friend who has a system capable of generating thousands of SEO-optimized content pages … all ready to be consumed and capture the clicks of users searching on Google … all to drive them to paying clients who want the traffic and leads. The Google system was built prior to the explosion of social networks … and therein lies the problem for them … and the real value in next-generation search.

The social networks become a way to filter and choose to trust – or not – based on the reputation of the person who created the content!  And as Michael states … using twitter to search for that information now allows you to easily “consider the source”.  In fact, I believe that we are going to see more and more reputation networks develop within twitter … and, oh yeah, I’m working on one – TopFollowFriday.com!

At first thought, people might look at the twitter “friends” and “followers” counts to determine popularity, but these two counts are now almost completely useless, and merely define a network of message flow.  Any attempt to rate or rank based on these counts is really a joke … purely implicit assumptions about why a user’s network has that many friends and followers.  In my opinion, those two counts, and the user’s network, merely provide the foundation for the creation of a true, explicit reputation network.

My real revelation came from Forrester Analyst Jeremiah Owyang when he commented – via a tweet – that it would be very interesting to see who were the most recommended people on #followfriday … a growing twitter ad-hoc event that occurs each Friday.  Started by Micah Baldwin, #followfriday is a hashtag that can be included in a tweet to endorse people you think are worthy of following on twitter. I immediately realized that I could leverage some twitter code that I was currently writing to quickly build a site that would track #followfriday, and then record the endorsements … allowing anyone to then explore the graph of endorsements.  TopFollowFriday.com has now transformed #followfriday from being a stream of tweets – which can total tens of thousands of individual messages on any Friday – into a site where you can now see who really endorses who and explore the network … finding new people to follow at any time!  No need to follow the #followfriday tweet stream … just come to TopFollowFriday.com and browse and search.

Now … what is important to realize is that the engine that I have written behind TopFollowFriday.com is not about #followfriday at all … it is the creation of a reputation network.  Explicit endorsements of people, by people.  Right now these are general endorsements, and are related to twitter and #followfriday … but I am in the midst of evolving this into a completely manageable reputation system … allowing you to not only add, but remove endorsements, and even more … stay tuned!
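
To make this concrete, here is a minimal sketch of the general approach … scan tweets for the #followfriday hashtag, treat each @mention as an endorsement, and tally them into a graph.  This is only an illustration of the idea – not the actual TopFollowFriday.com engine – and the tweets and usernames below are made up:

  import re
  from collections import Counter

  # Made-up sample tweets; a real system would pull #followfriday tweets from twitter.
  tweets = [
      {"from": "alice", "text": "#followfriday @bob @carol - great folks to follow"},
      {"from": "bob", "text": "My #followfriday picks: @alice @carol"},
  ]

  endorsements = Counter()   # (endorser, endorsee) -> count of endorsements
  for tweet in tweets:
      if "#followfriday" not in tweet["text"].lower():
          continue
      for endorsee in re.findall(r"@(\w+)", tweet["text"]):
          endorsements[(tweet["from"], endorsee.lower())] += 1

  # Tally who has received the most endorsements across the whole graph.
  received = Counter()
  for (endorser, endorsee), count in endorsements.items():
      received[endorsee] += count
  print(received.most_common(5))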

twitter is rapidly providing a new platform for search, and when combined with reputation networks it will allow each of us to quickly and easily voice our opinion about anything … and allow those in our social networks to search for – and find – those opinions and endorsements completely within the social networks that we trust … our specific friends and co-workers whose opinions we trust.

Installing CentOS v5.2 in Virtual PC 2007

I am still a big fan of Microsoft Virtual PC 2007 (VPC) as a solution for experimenting with various operating systems.  If a machine is running Windows, you can go and download VPC 2007 for free … and then simply create a virtual machine, and run the OS of your choice in a window.

Well … almost.  The issue comes down to compatibility with the “virtual” hardware.  Lately, I have found that many of the Linux distributions make no effort to ensure that their releases install easily in VPC.  I’ll address that in another post.

Today I again wanted to test a new distro – CentOS v5.2 – in VPC and when I started the graphical install I was met with the same old issues. Immediately I get a completely distorted graphical screen … which is one of the most common issues.  It turns out that for host memory considerations, the “virtual video card” in VPC is limited to 8MB of video memory.  This time I chose to find a better, easier solution … and I did.

Bad Video Card settings in CentOS Install

Poking around Google I was able to find this post that gave me the answer.  It turns out that there is a kernel parameter that can be set to force the limit on the size of the VESA video frame buffer.

vesafb=vtotal:8M

I gave it a try, and it worked!  Of course, once I got through that, I got to the next most common issue that I hit … the mouse wouldn’t work.  Back to Google to find that fix again.  I found it here.  There are two simple parameters to add … one for the mouse, and the second to have the wheel work.

i8042.noloop psmouse.proto=imps

The first part fixes the mouse … the second is for the wheel.

So how do you use all of this information?  It’s really simple.  When you boot the CD or DVD to install CentOS v5.2, you’ll get to this screen:

CentOS Install Menu

Once you get here, simply type:

linux vesafb=vtotal:8M i8042.noloop psmouse.proto=imps

Hit Enter and you are off and going!

Now, when the installation was complete and I rebooted I was surprised to find that the video was still working.  I didn’t have to do anything else.  BUT … the mouse was again not working.  The trick is to interrupt the GRUB boot loader, and edit the settings of the kernel line.  So when the system boots, and gives you a chance to “Press any key to enter the menu” … hit any key!

You’ll then be presented with the GRUB menu, and you can then:

  1. Hit ‘e’ to edit the highlighted CentOS entry
  2. Arrow down to the line that starts with “kernel” and hit ‘e’ again
  3. Add the following to the end of the line:  i8042.noloop psmouse.proto=imps
  4. Hit enter
  5. Type ‘b’ for boot

That will get you booted and running.

To make the change permanent, you’ll have to edit the grub.conf file as follows (a sample of the finished entry is shown after the steps):

  1. Login as root (or you can use sudo if you have set up the sudoers file.)
  2. Open grub.conf in an editor of your choice (e.g. nano /etc/grub.conf)
  3. Once again look for that “kernel” line … go to the end of the line
  4. Add the following to the end:  i8042.noloop psmouse.proto=imps
  5. Save the file and exit.
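
For reference, here is roughly what a finished grub.conf entry might look like … the kernel version and root device shown are just typical defaults for a CentOS v5.2 install and will vary on your machine – the point is simply where the two extra parameters land on the kernel line:

  title CentOS (2.6.18-92.el5)
          root (hd0,0)
          kernel /vmlinuz-2.6.18-92.el5 ro root=/dev/VolGroup00/LogVol00 rhgb quiet i8042.noloop psmouse.proto=imps
          initrd /initrd-2.6.18-92.el5.img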

After all of this, I have a working CentOS v5.2 in Virtual PC 2007.  Well … except that I have no audio working.  It now appears that CentOS v5.2 does not include the “snd-sb16” sound card driver.  I found that I can get the sources to build the drivers from the alsa-project.org website … but that is something that will occur on another day.  🙂

Supercomputers and Solar Efficiency …

I happened to catch this article today while reading on the net.  To me, this is truly impressive in two core ways:

New solar cell material achieves almost 100% efficiency, could solve world-wide energy problems

Columbus (OH) – Researchers at Ohio State University have accidentally discovered a new solar cell material capable of absorbing all of the sun’s visible light energy. The material is comprised of a hybrid of plastics, molybdenum and titanium. The team discovered it not only fluoresces (as most solar cells do), but also phosphoresces. Electrons in a phosphorescent state remain at a place where they can be “siphoned off” as electricity over 7 million times longer than those generated in a fluorescent state. This combination of materials also utilizes the entire visible spectrum of light energy, translating into a theoretical potential of almost 100% efficiency. Commercial products are still years away, but this foundational work may well pave the way for a truly renewable form of clean, global energy.

The first thing about this that I find impressive is the use of supercomputers to solve these problems.  Our advances in raw compute power are often talked about, but what is the real value delivered to us by all of this compute power?  Well … massively efficient solar power would be a heck of a reward.  From the article:

Supercomputers are enabling an entire new area of materials. No longer do scientists have to physically create samples of every possible material in the lab, only to test and document everything they find about it. Today they can set up a series of parameters and instruct a supercomputing machine to find the one that best aligns with their desires, wants and wishes. And while such computations often takes many days or even weeks for each trial material, it’s more economical and feasible than the old route. Plus, it enables materials like these which were, in this context, accidentally discovered using computers.

It is this ability to simulate and iterate – at incredible speeds – that allows us to evaluate the massive numbers of permutations and combinations to discover these types of solutions.
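
Just to illustrate the simulate-and-iterate idea (and only to illustrate it … the real work involves detailed quantum-mechanical materials simulations running for days on supercomputers, and every parameter below is invented), the search boils down to scoring every combination of candidate parameters and keeping the best one:

  from itertools import product

  # Invented stand-ins for real material parameters.
  dopants = ["molybdenum", "titanium", "ruthenium"]
  polymers = ["polymer_A", "polymer_B"]
  thicknesses_nm = [50, 100, 200]

  def simulate_absorption(dopant, polymer, thickness_nm):
      """Stand-in for a physics simulation that would normally take days per candidate."""
      return (hash((dopant, polymer, thickness_nm)) % 1000) / 1000.0   # fake score in [0, 1)

  # Exhaustively evaluate every permutation and keep the best-scoring candidate.
  best = max(product(dopants, polymers, thicknesses_nm),
             key=lambda combo: simulate_absorption(*combo))
  print("Best candidate:", best)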

The second impressive aspect of this discovery, to me, is that we are now finding new uses for common foundation materials.  We are now beginning to discover the unique ways in which materials can be combined to create just the right conditions to produce energy from something as common as sunlight.  If we can now rapidly transform this type of discovery into an actual commercial material – which might take years – it could have immeasurable impacts on the lives of humans all over the globe.

The beginning of commodity telepresence …

With the growth of the Internet, and wireless extensions of the Internet (via Wifi and broadband wireless), it is inevitable that we are going to see our abilities to participate in remote events enhanced through telepresence.  There are numerous companies working in this area for “high-end” solutions … Cisco being one of them.  But what is now fascinating is the growth from the bottom up … the small, inexpensive “toys” that are beginning to show up in the market.

Telepresence is the ability to be present somewhere else … so that you can interact with the world at that remote location without actually being there.  Companies like Cisco are working on this for various communications solutions … so that you can have virtual meetings or present at a conference when you are not actually there.

Last week I found the iRobot ConnectR which is an impressive start.  The ConnectR is referred to as a Virtual Visiting Robot and is built on the same iRobot Roomba vacuum platform.  It is able to leverage all of the features of that platform, including the transport, navigation, and auto-recharge/dock features … but probably without the vacuum.  What has been added instead is a tilting webcam, microphone, and speakers … along with a Wifi wireless radio.  Once you have purchased this unit, you can put the charging station in a corner, and then remotely – either from your home, or across the globe – connect to the ConnectR.  You can then “drive” it around your house … looking through the webcam, and listening through the microphone … and then even talk to anyone or anything that might be around.  You could be using this to check on your house, your pets, your children, or even your parents.  And yes … iRobot is already exploring the various issues of security, and how you control access to the ConnectR remotely … and also how you can locally disable the robot.  This robot is currently in pilot/beta, and is estimated to cost $499.99 … so not the cheapest … but as an owner of a Roomba I can guess the quality will be there.

Yesterday on my way home from Oakland, California I was flying on Southwest Airlines and found yet another iteration of the commodity telepresence robot … and this one is also very impressive.  It’s being promoted as a “toy” by Erector … yes, the folks that used to make Erector Sets.  Erector has since been bought by Nikko, an innovative manufacturer of electronic toys … which are growing rapidly in capabilities.  The Spykee Robot is a base platform with treaded tracks containing a Wifi radio and basic processor.  This basic platform is designed to allow the owner (uh … not just kids … I want one!) to build a “torso” using Erector Set parts.  There are supposed to be three models coming … with slightly different parts … and I’m sure a bunch of add-on kits.  On the torso is again a webcam, speakers, a microphone, and various lights.  The Spykee also has a PC “control panel” application that allows you to remotely – in your house, or across the globe – connect to the Spykee and cruise around interacting with the world.  There is also a “security” feature which allows the Spykee to be watching via its webcam, and to e-mail you photos when motion is detected.  The Spykee appears to be close to shipping in the UK … Amazon says that the US availability is November 15th … list price of $299.99.

While investigating the Spykee, of course I came across the Wow Wee Rovio.  This is another $299.99 remote telepresence robot, with a browser-based control panel.  The three-wheeled unit contains the processor, Wifi radio, webcam, microphone and speaker, along with some sort of optical tracking – similar to the iRobot units.  There is a docking station for recharging, and optional tracking beacons that can be bought – I’m guessing – to enhance the navigation of the robot around your house.  This unit appears to be available now in the US, and I’m surprised I haven’t seen one in the stores yet.

What is impressive is that these last two units are now below $300 for a complete – basic – telepresence robot.  The example videos provided by both companies demonstrate both home and office uses, and begin to move into the home security space … and even the home video surveillance space.  As we saw with the prices on Wifi Access Points, and other hardware … I can only expect these prices to continue to fall.

The next area that I am going to look at for these robots, and something that I believe will be important for success, will be the “hackability” of the control protocols.  How easy will it be for hackers to begin to enhance the controlling applications, and for the robots to be integrated into more extensive applications?  Imagine when someone has created the automatic search and mapping application for the Rovio or Spykee that allows someone to release dozens of these into an unknown building and have the robots quickly survey the inside and report back what they find.  What other applications are going to emerge?  Maybe I can rent a remote Rovio to explore a remote location that I otherwise might never visit!  Explore the Louvre after hours?  Visit underground caverns on the other side of the planet?  Hmmm … maybe there is an interesting business model in there somewhere …  🙂
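
Just to show how trivial such integrations could become once a protocol is opened up, here is the kind of script I have in mind … note that the address, endpoint, and parameters here are entirely hypothetical – this is not the Rovio or Spykee protocol, just a sketch of what a hackable HTTP-style control interface might look like:

  import urllib.request

  ROBOT = "http://192.168.1.50"   # hypothetical robot address on the home Wifi

  def drive(direction, speed=5):
      """Send one hypothetical drive command to the robot over HTTP."""
      url = f"{ROBOT}/control?cmd=drive&dir={direction}&speed={speed}"
      return urllib.request.urlopen(url).read()

  # A tiny "survey" loop: creep forward, turn right, repeat ... a real building-survey
  # application would add mapping, photos, and reporting on top of this.
  for _ in range(4):
      drive("forward")
      drive("right")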

HUD Homes – Investment and Ownership

Things have been busy lately on the development front.  Our College Football site is doing well, and growing steadily.  This month we launched a new website called GoHUD.com that is a resource for home buyers and investors to search the portfolio of HUD Homes that are currently available through the US Federal Government.

HUD Homes are an interesting opportunity for home buyers, and also investors, who want to purchase a home in almost any state across the country.  A “HUD Home” is one that is being sold by the Department of Housing and Urban Development.  When someone with an FHA-insured mortgage can’t make the payments, the lender forecloses on the home and HUD takes ownership.  It is then offered for sale at market value, based on a recent AS-IS appraisal – meaning the market value in its current condition.  Obviously with the current financial market, the number of available HUD properties has doubled over the last 6 months, and some states have thousands of available properties.
HUD properties are made available in a variety of ways, with an emphasis on “Owner Occupied” purchases that include the “Good Neighbor Next Door” buying opportunities for “Law enforcement officers, pre-Kindergarten through 12th grade teachers and firefighters/emergency medical technicians.”  This last group of individuals can get incredible discounts on the purchase of a HUD property.  The prices on HUD Homes tend to be very price competitive in the areas where they are located, and each will have an associated Property Condition Report.

When watching the prices on the homes, you might notice a common pattern as a particular home stays on the market.  The home will be listed, and be available to “Owner Occupants” … meaning that you must live in the home as your primary residence if you purchase it.  If the home is not sold in a week or so, then the status will change to “All Bidders” … meaning that anyone can now bid to purchase the property for any purpose.  If the home still does not sell, it is common that the price will then be reduced by a full 10%, and the status will return to “Owner Occupants” again.  This process will repeat until the home sells.  Owner occupant, all bidders, price reduction … and on and on.
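
The cycle is simple enough to sketch out … here is a toy model of the pattern described above (my own illustration – not HUD’s actual rules or timing, beyond the 10% reduction mentioned):

  def next_listing_state(status, price):
      """Toy model of the cycle: owner occupants -> all bidders -> 10% price cut -> repeat."""
      if status == "owner_occupants":
          return "all_bidders", price
      return "owner_occupants", round(price * 0.90, 2)   # back to owner occupants at a lower price

  status, price = "owner_occupants", 200000   # hypothetical starting list price
  for week in range(6):
      print(f"week {week}: {status} at ${price:,.2f}")
      status, price = next_listing_state(status, price)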

Our current website aggregates the details of all available HUD Homes across the country, and provides a variety of ways to search, map, and “favorite” any property in the database.  We’re continuing to add features to automate your monitoring of a property and the status changes, and also to engage with a HUD Authorized Real Estate Agent, who can assist you with navigating the various issues in researching and eventually purchasing a HUD property.

To me, it has been a fun project to learn more and more about HUD Homes, and how they work … and the opportunities that exist.  I’m not sure if the real estate market is close to the bottom yet, but I’m watching homes in a wide range of cities across the nation for an eventual investment opportunity.

Check out the website and let us know what you think!

College Football

Wow … I really haven’t paid attention to how long it has been since I have blogged.  I have been absolutely heads-down on two major Internet projects … my SMS/Text Messaging platform, and a new partner site – CollegeFootball.com.

CollegeFootball.com has become an interesting experiment that involves a social network with news and information services. We are now evolving that platform to see what gains traction with the fan base, and what features the average College Football fan is after.  There are, of course, the fans wanting to find out game information, standings, ratings, and other team-related information, but then there are also all of the “social” aspects of College Football that we are wanting to support and promote.

The Social Network
The site is built around a foundation of a social network and user-generated content.  Anyone can come and join the site; they are then able to invite friends, search for friends, and form on-line relationships.  One area that we expanded on is the relationships.  Instead of just having “friends” we chose to implement multiple levels of relationships – Close Friends, Friends, Family, and Acquaintance – so that you can more precisely control who is able to see your content, and how they interact with you.

The User Content
The user-contributed content is currently in the form of posts, or articles, photos, and events.  We’ll soon be adding videos also.  On all contributions the fan is able to specify the “visibility” or who is able to see that particular piece of content. The choices are Close Friends, Friends, Family, Acquaintance, and Public … with Public being the default.  With the additional levels of relationships and the visibility flags, a fan could share posts, photos, or events with Close Friends and no one else.  They could also share them with Family, which would then make them visible to Family, Friends, and Close Friends.
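
A minimal sketch of how that kind of visibility check works in principle (this is an illustration of the idea, not our actual CollegeFootball.com code) … the levels are ordered by closeness, and content set to a given level is visible to that level and anything closer:

  # Relationship levels ordered from closest to most distant.
  LEVELS = ["close_friends", "friends", "family", "acquaintance", "public"]
  RANK = {level: i for i, level in enumerate(LEVELS)}

  def can_view(viewer_relationship, content_visibility):
      """Content marked 'family' is visible to family, friends, and close friends."""
      return RANK[viewer_relationship] <= RANK[content_visibility]

  print(can_view("friends", "family"))        # True  - friends are closer than family
  print(can_view("acquaintance", "family"))   # False - acquaintances are more distant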

The other important attributes that can be assigned to any post, photo, or event are a Primary and a Secondary School.  This allows the fan to relate their content to schools or teams so that other fans can search and locate content related to their favorite – or not so favorite – teams.  These assignments also allow the site to associate content with particular school pages, and also make it more visible to our “governing users” … which I’ll touch on later.

Posts
Posts, or articles, are like very simple blog posts.  Right now we do not offer a rich-text editor (which is coming) and inserting images in posts is a manual process, but we’ll be improving this in the coming weeks.

Photos
Fans are able to create Photo Albums and then upload images to their albums.  Again, the visibility features allow very fine grained control of who is able to see what photos.  Within an album, individual photos can have different visibility settings allowing some photos to be seen only by Close Friends, and others by Family.

Events
With all of the activities surrounding College Football, events are a way for fans to announce tailgate parties, other pre-game events, post game celebrations, and even places where games can be watched together.  As with the other content types, the visibility can be set, and the events can also be associated with a school or schools.  We’ll also be adding some invite and reminder capabilities to the events.

Commenting
All of the content types on CollegeFootball.com can be commented on by registered users.  The commenting system provides for two levels of commenting … comments being made directly on any piece of content, and then replies to comments.  The replies are only one-level deep, meaning that you can not reply to a reply.  When comments are made, the content owner is notified via e-mail, and when replies are made both the content owner, and original commenter are notified by e-mail.
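
Here is a rough sketch of that structure (again, an illustration rather than our actual code or schema) … comments hang directly off a piece of content, replies hang off a top-level comment, and a reply to a reply is simply rejected:

  from dataclasses import dataclass, field

  @dataclass
  class Comment:
      author: str
      text: str
      replies: list = field(default_factory=list)   # only top-level comments carry replies

  @dataclass
  class Content:
      owner: str
      comments: list = field(default_factory=list)

  def notify(recipient, actor):
      print(f"e-mail to {recipient}: new activity from {actor}")

  def add_comment(content, author, text):
      comment = Comment(author, text)
      content.comments.append(comment)
      notify(content.owner, author)                 # the content owner is notified
      return comment

  def add_reply(content, parent, author, text):
      if parent not in content.comments:
          raise ValueError("replies are only one level deep")
      parent.replies.append(Comment(author, text))
      notify(content.owner, author)                 # the content owner ...
      notify(parent.author, author)                 # ... and the original commenter are both notified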

As of this coming weekend we’ll be deploying a new build which will add Profile Commenting, or writing public comments on other fans’ profiles.  We haven’t nailed down what we’ll call this feature … writing on a fan’s what?  Feel free to comment below with suggestions if you have any!

From the Fans
As fans contribute content, all of the content marked as Public can then be viewed on our From the Fans page.  Registered users can then also rate (Cheer!), and comment on the content.  There are a number of filters that can be modified on the From the Fans page to customize the view and assist fans in finding what they are interested in.  The first level is “Most Recent” or “Most Popular”.  The default is Most Recent and shows the content most recently contributed on the site.  The Most Popular sorts the content based on the number of Cheers (votes) that a piece of content has received.  Additional sorting and filtering are by the content type, allowing for only viewing posts or photos or events, and also the timeframe of contributions.
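
The mechanics of those filters are simple … roughly something like this (illustrative only, with an invented item shape):

  def from_the_fans(items, sort="most_recent", content_type=None, since=None):
      """Filter and sort public contributions; items are dicts with 'type', 'created', 'cheers'."""
      results = [i for i in items if i.get("visibility", "public") == "public"]
      if content_type:
          results = [i for i in results if i["type"] == content_type]
      if since:
          results = [i for i in results if i["created"] >= since]
      key = "created" if sort == "most_recent" else "cheers"
      return sorted(results, key=lambda i: i[key], reverse=True)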

A fan is also able to view the contributions by school or team.  By making this selection they can drill down to user content that is only related to a team they are truly a fan of.

Governing Users
Also core to the design of the platform is a concept that we call Governing Users … these are “editors” or users who can be given control of “promoting” certain content to school pages that are under their control.  When we grant Governing User privileges to a user we also pick the pages they have authority over.  For example we might give one user authority over the University of Utah page, and another user authority over the Army and Navy pages.  From then on, as they explore the site and the fans content they see a small icon that allows them to “promote” that piece of content to a public school page.  The governing user, when they click that link, is presented with the list of pages they have authority over along with date selectors on when they want that content to appear and be removed from that page.

Governing Users are then able to use the From the Fans page to also look for interesting content, and promote the content that they feel would be the biggest contribution to fans of a particular school or team.  It is this feature that then splits the site into two completely different “sides” of the site – the general public user contributions, and content that is deemed as being worthy of showcasing.  Both sides are always visible and accessible, and then both serve different purposes.
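
Conceptually the promotion feature boils down to a small record like this (an illustrative sketch, not our production schema) … the governing user, the piece of content, the school page it is promoted to, and the date window during which it should appear:

  from dataclasses import dataclass
  from datetime import date

  @dataclass
  class Promotion:
      governing_user: str
      content_id: int
      school_page: str
      show_from: date
      show_until: date

  def promotions_for_page(promotions, page, on_day):
      """The content showcased on a given school page on a given day."""
      return [p.content_id for p in promotions
              if p.school_page == page and p.show_from <= on_day <= p.show_until]

  promos = [Promotion("editor_utah", 42, "University of Utah", date(2008, 9, 1), date(2008, 9, 8))]
  print(promotions_for_page(promos, "University of Utah", date(2008, 9, 5)))   # [42]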

What’s next?
There is a lot more that I could talk about … and a lot more coming … but it’s been fun so far.  The one thing that we did is make the entire platform independent of the specific content.  So although this is currently College Football related, we could use this same platform for any type of social network that comes along.  We already have some leads on additional domain names where we might apply the platform … it’ll be fun to see how it goes.

Please go and check out the site … use the Feedback button on each page to give us feedback (being nice of course, and understanding that we built this – from scratch – in 8 weeks!) … I really want to hear from fans what they like, what they don’t, and what needs to be added!

Video Stitching, Processing and the power of computers …

I saw this amazing Stabilized Video Collage this morning while reading … this is really impressive.  You have to see it to really appreciate what is being done.  As the author writes:

While some people are still endlessly yelling at Flickr for the new video features, some others are experimenting around it. Especially the ones that were already experimenting with pictures, doing collages, blending, panoramics or any kind of little planets. One of my contacts on Flickr is PaintMonkey, a very inspired photographer when it comes to photocollages. His photostream is really amazing and creative. He recently explored around the concept of “long photos” (as Flickr calls its videos) and quickly came up with some collage experiments. As I was, for my part, discovering Motion 3 and its stabilization features, I came up with this Stabilized Collage experiment. I am really impressed by the simplicity and the quality of Motion’s stabilization features. It’s really quick and simple to work with. The first time I really got the stabilization concept was when I saw the stabilized version of the JFK assassination Zapruder video. At that time, I guess this had been done by hand, frame after frame, but now it’s as easy as click-and-process.

He has posted several other examples of his work – HipHop By The Canal, Skatepark At The Canal, Late Sunday Morning At The Farmers market, Stabi Portrait B. v4 – along with his Stabilized Video Collages, How-To.  All of these make me think about what could be possible with this … as an art form, and other possible uses.
There are the immediate thoughts about using several cameras to record at the same time, and then sync up the videos … versus one camera recording the same scene (slightly offset) at different times.  The creation of huge panoramic videos becomes possible at very affordable prices.  I also started to think about the creation of a collage of an outdoors scene that would be made up of video taken from the four different seasons … mixing the beauty of each season into one field of view.
Amazing … “click-and-process” … compute power, and the complexity of software is ever increasing … and amazing works like this show up!

Telecosm 2008 – Quantum Entanglement and the Next Phase …

The wrap-up of the Gilder Telecosm conference is always one of my favorite presentations. For the last number of years, it has always been Carver Mead speaking … and he is an incredible man. It’s not only his accomplishments, but his presentation itself … his presence … his speech … his wisdom.

Carver Mead, Internationally known author and educator; Holder of the National Medal of Technology; Founder of twenty-five companies; Gordon and Betty Moore Professor of Engineering Emeritus, California Institute of Technology

Some of his work includes the foundation technology behind Foveon, “a world innovator in the design and development of image sensors and image capture systems for a wide range of digital capture products.” If you aren’t familiar with his company, you can read about the Foveon technology … it is amazing, and well known to extreme photographers. In addition, this year another company with his involvement, Audience, presented here … creating sound processing silicon modeled after the human ear and brain.

Part of the introductory presentation was by Louisa Gilder, who wrote the book The Age of Entanglement: When Quantum Physics Was Reborn. She gave a brief history of the debates that go back to the 1930’s about quantum theories, and the various experiments up through the work of John Bell.

Carver was introduced by Lloyd Watts, Chairman and Chief Executive Officer of Audience. Lloyd had been a student of Carver’s at Caltech. He did a wonderful introduction and thanked Carver for the impact that Carver has had on so many people’s lives … including his own.

Carver began his presentation by talking about research that has shown how inaccurately we perceive our own abilities … and how that relates to human behavior. He cited one study showing that students in the bottom quarter of a class – scoring down around the 10th percentile on a test – tended to think they had not done that badly … even thinking they were above average. Meanwhile, the best and brightest in the top quarter … scoring in the 90th percentile … tended to question how well they had done … were unsure of their ability … and tended to guess lower than they had actually scored. His point was how real genius is often able to question its own knowledge, and to be open to what it does not know … to always question its own knowledge, and to “push the limits of human knowledge.”

He ventured into a philosophical conversation about creativity and genius. He questioned where ideas come from … true inventions or innovation. He touched on eastern thoughts where they believe that all things undiscovered, and un-manifested, exist in the “darkness”. They exist, but have not yet been found by us … and will only be found when we choose to explore the darkness searching for them. He expressed that this is only one way to view this phenomenon, and that we each have our own ways to clear our minds, to explore the darkness, and to create. I enjoy thinking about this as I find the most powerful way for me to “explore the darkness” is to break my usual patterns … to go and do things that I usually wouldn’t.

Carver then talked about the power of words. The importance of words. How the words that we choose to use define the context that we operate within. He talked about the ways that people begin to create models of new concepts … and that the language that they use to define these new models automatically begin to limit themselves. I have also been taught this same idea from numerous sources … that as we grow up we begin to become “loose” with our language, and underestimate the power of what we say … and don’t say. We often use “weak” words, and express disbelief and disappointment. We hear stories about the power of positive thinking, but fail to realize how much of this is expressed in our speaking. If we are not always thinking and speaking in a positive way, it can poison our whole way of being. This includes how we talk about ourselves, others, and what we are up to in life.
The next area of conversation that Carver explored was the extension of our senses. He expressed that when he is working on his research, and looking at an oscilloscope connected to some semiconductors, he is not looking at the waveforms but instead using them as a way to visualize in his mind the individual electrons flowing through the semiconductor. He related this to using a hammer … when you first start using a hammer, it is awkward and you fight to control the direction and impact. But as you become better and better with a hammer, it actually starts to become an extension of you … and you begin to gain control over directing the impact, and also sensing if you made good impact with your target. Likewise he said that as you gain skill at using any tool, you can allow that tool to become an extension of you … and to allow that tool to give you new ways to see what is going on.
As he wrapped up, he posited that all matter is waves … there are no particles … no points of energy. All matter is waves. Nothing but waves. The energy of matter is its frequency, and its wavelength is one over the momentum … the vector of momentum. I had to stop taking notes to focus on what he was saying, but it was an incredible way to visualize matter. He talked about electrons not being in “shells” or “orbits” around the nucleus, but instead to consider that electrons take on different “wave functions” wrapped around the nucleus. He then also discussed how due to the charges, and the electrons wanting to avoid each other, they are forced to take on different wave functions in avoidance. Some of these wave functions can become more and more convoluted, and then can become “smoothed” when these atoms get near other atoms … creating molecules. And the binding nature of molecules is formed by the more natural wave functions that are formed through the combination of atoms.
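
For reference, I believe the relations he was alluding to are the standard Planck–de Broglie relations:

  E = h * ν        (the energy of matter is proportional to its frequency)
  λ = h / p        (the wavelength is Planck’s constant over the momentum … “one over the momentum” in units where h = 1)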

There is so much more that he said that I didn’t get down in notes … I really wish that Gilder would post the podcasts and vidcasts of these presentations … they are an outstanding source of new perspectives. Carver Mead is an amazing man … what a contribution to society.

Telecosm 2008 – scaling Internet backbones …

There were a number of cool presentations today with a focus on the semiconductors and optical components … and various network processing units and multi-core general purpose processors for high-speed backbone networking. It’s actually a fascinating subject area that few people seem to really be aware of. We all take the bandwidth to our homes as a given … and to our businesses, and to our hosted servers, blogs, flickr, twitter, and YouTube and on and on. But how is all of that backbone bandwidth … running over the fiber that connects us to our favorite sites, services, applications, and videos … actually built out?

Well … there are a number of vendors that provide the bulk of the equipment, and within that equipment there are providers of the subcomponents and silicon behind the massive amounts of bandwidth provided by the Internet. Cisco, Juniper, and Alcatel add up to 90% of the market for the really high-end backbone switching and routing gear.

The dominant solution in this space is using DWDM – Dense Wavelength Division Multiplexing – to place multiple colors of light on the same fiber, with each color carrying its own data. The current “commodity” speeds that are being sold are running at 40Gbps per link, and these are then installed in 4, 8, and 16+ slot chassis providing up to 1.2+Tbps packet switching speeds. Yes … that is 1.2 Terabits per second … 1.2 trillion bits per second … or roughly 150 billion characters per second. Kinda’ fast.

Some of the presenters today were Infinera, Luxtera, Photonic, and BroadLight. If you want to learn more about DWDM … Infinera has some cool videos that provide some details about their products on the Infinera Videos page. Their first video demonstrates how DWDM works …

During the presentations, there were a few stats that really stood out to me. One of these was the current average backbone bandwidth, per US carrier. Here in the US, although the presenter admitted that it varies, the average was pegged at about 400Gbps+ on their backbone links. The key is that the estimates call for an average of 75% growth in the next year!

One of the examples of the calculations was based on taking this forward for 10 years … so if the Internet grows at 70% per year for 10 years … using the current 40Gbps DWDM optical technologies (a rough back-of-the-envelope of that compounding is sketched after the list below):

  • 15 million DWDM transponders will have to be added
  • 165 million mechanical fiber couplings will have to be installed
  • 4 GigaWatts of additional power will be required
  • AND … In 10 years, they would be installing 4000 DWDM transponders PER DAY …
  • … requiring 2000+ more technicians!
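
Here is my own back-of-the-envelope on that compounding (my numbers and assumptions, not Infinera’s model – it only shows how quickly 70% annual growth stacks up against today’s 40Gbps wavelengths):

  # Back-of-the-envelope: 70% annual traffic growth compounded over 10 years.
  growth_rate = 0.70                 # assumed annual growth
  years = 10
  factor = (1 + growth_rate) ** years
  print(f"traffic multiplier over {years} years: {factor:.0f}x")                 # roughly 200x

  backbone_today_gbps = 400          # average per-carrier backbone link from the talk
  backbone_future_gbps = backbone_today_gbps * factor
  print(f"implied per-carrier backbone: {backbone_future_gbps / 1000:.0f} Tbps") # ~81 Tbps

  wavelength_gbps = 40               # today's commodity DWDM wavelength
  print(f"40G wavelengths needed per link: {backbone_future_gbps / wavelength_gbps:.0f}")  # ~2,000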

Infinera was presenting on their upcoming 100Gbps optical technologies, and also mentioned their eventual 400Gbps product, followed by 1Tbps, 2Tbps, and 4Tbps chipsets. Obviously as they – and other companies – are able to deliver these higher capacity solutions, there will be smaller numbers of units required to keep pace.

To me it’s impressive to see that people are working on creating these next-generation solutions to boost the capabilities offered … to ensure that the Internet backbones can keep pace with the demands for bandwidth being created by us users.

There is a lot going on in the industry to cope with Internet bandwidth demands …