Telecosm 2008 – Bob Metcalfe on Energy

Bob Metcalfe, of Ethernet fame, did a presentation this morning on his current investments in energy. He related, throughout the presentation, ways to link the progress of the Internet to work that could be done in looking for cheap, clean, effective, alternate energy sources.

Green Fuel – one of his investments – is exploring the use of algae to create feed, food, and fuel from the CO2 emissions of smokestacks. Dried whole algae can be used for feed and food. (He explained that the Omega-3 fatty acids we get from fish actually come from the algae the fish eat … and suggested that we “cut out the middle fish”.) Algae oil can be used for biodiesel. Delipidated meal – what’s left after the oil is extracted – can be used for foods. “Carbon credits” can also be gained … but are insignificant.

He also suggested that “Green” is not necessarily the right “color” for the clean energy movement. It turns out that the “Greens” in the political realm seem to stand for more than just environmentalism – also anti-technology, anti-capitalism, anti-trade, and anti-American positions. None of these will solve the really big challenges. Bob suggested that we consider it the “Blue” movement … he believes the best solutions will be related to the sky and the oceans. His mantra is “cheap and clean” energy.

Bob briefly reviewed some of his other investments. Mintera is doing 40Gbps Long Haul DWDM Optical Transmission solutions … allowing people to telecommute. SiCortex is creating supercomputer Linux clusters … delivering more compute power per dollar, per foot, per watt. Ember is creating control systems using ZigBee-standard CMOS radios and a protocol stack. Infinite Power Solutions is creating solid-state thin-film Lithium-ion batteries with nearly unlimited recharge cycles. He also mentioned 1366 Technologies, which is creating next-generation silicon solar cells, and SiOnyx, which is creating Black Silicon with very useful photonic properties.

He stressed that energy creation and usage are not directly tied to the environment; they are independent issues. If global warming were solved tomorrow, we would still want cheap, clean energy! The solutions to global warming – if we want them – are about climate control! He questioned where the real research into the numerous climate-control ideas is actually happening.

He ended by joking about the day that we would get to ask the United Nations “Now that we can control the planet’s temperature … exactly what temperature do you want the earth to be?”

Yeah … that would be the day … I’m sure they already have that all figured out. Oh … and there would be no side effects at all to stopping the variations in planetary temperatures … right!

Telecosm 2008 – Cloud Computing and The Exaflood …

Nicholas Carr was here at Telecosm to present on the shift – the “big switch” – to cloud computing. He reviewed the background on the evolution of electricity, and drew the parallels between the early days of generating your own power and the move to a model where the power grid is a commoditized asset.

Nicholas reviewed the implications of Rethinking the Data Center: virtualization of computing and storage, consolidation, programmable environments, automated management, multi-tenant facilities, and energy efficiencies. For anyone familiar with what Amazon is up to with AWS, and now Google with Google App Engine (GAE), this is all well known.
The next presentation was by Andrew Odlyzko who spoke about the Exaflood … the growth of Internet traffic. It was filled with facts about the current state of Internet traffic … and some predictions on the future.  One interesting fact … right now, the growth rate is actually slowing, even though the hype is accelerating.  Internet traffic growth is occurring … just not as fast as it has in the past.
Here are some of the more interesting numbers that Andrew talked about:

  • Qwest CTO Estimate: IP traffic to go from 9 PB/day in 2007 to 21 PB/day in 2012
  • Estimates for Internet traffic growth rates
    • mostly in the 50%-60% per year range
    • With 50% growth rates offset by 33% decline in cost … not much change in overall costs to support new levels of traffic.
  • Year-end 2006 worldwide  numbers:
    • digital storage: 185,000 PB
    • Internet traffic: 2,500 PB/month
  • Year-end 2006 US Internet traffic per capita:
    • 2GB/month
    • TV consumption ~40GB/month (assumes 3hr/day, 1Mbps, no HDTV)
    • TV/Video over the Internet will add *some* traffic, but not massive new numbers
  • Wild numbers about revenues to providers.  Revenue per MB:
    • SMS = $1000 / MB
    • Cell voice calls = $1 / MB
    • Wireline voice calls = $.10 / MB
    • Residential Internet = $.01 / MB
    • Backbone Internet = $.0001 / MB
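Two of these figures are easy to sanity-check with back-of-the-envelope arithmetic (a quick sketch; the 30-day month is my assumption, while the 3 hr/day at 1 Mbps and the 50%/33% figures come from the bullets above):

```python
# Sanity check 1: TV viewing at 3 hr/day over a 1 Mbps stream
seconds_per_day = 3 * 3600
bytes_per_day = seconds_per_day * 1_000_000 / 8      # 1 Mbps = 10^6 bits/s
gb_per_month = bytes_per_day * 30 / 1e9              # assume a 30-day month
print(round(gb_per_month, 1))                        # -> 40.5, i.e. ~40 GB/month

# Sanity check 2: 50% traffic growth offset by a 33% unit-cost decline
# leaves total transport cost essentially flat year over year.
cost_ratio = 1.50 * (1 - 0.33)
print(round(cost_ratio, 3))                          # -> 1.005, ~unchanged
```

Both of the quoted claims hold up: ~40 GB/month for TV, and near-zero net change in transport cost.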

The last figure that was shared during the panel discussion was that Eric Schmidt indicated last month that Google is currently accepting 10 hours of YouTube video per minute!  That comes to 14,400 hours of video PER DAY being uploaded to YouTube … absolutely amazing volume of data.
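The arithmetic behind that daily figure is straightforward:

```python
# 10 hours of video uploaded every minute, around the clock
upload_rate_hours_per_minute = 10
hours_per_day = upload_rate_hours_per_minute * 60 * 24
print(hours_per_day)   # -> 14400 hours of video uploaded per day
```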

Telecosm 2008

This weekend Andrea and I came to the east coast – Lake George, NY – to have some fun, and to attend the Gilder/Forbes Telecosm 2008 conference. It was a lot of fun Sunday through Tuesday exploring the local area, going hiking in the Tongue Mountains, and having some great food. The Sagamore is an amazing resort on Green Island in Lake George, and we are staying in a very nice condo on the bay.

Last night we took a great cruise on the Adirondack – a 115′, three deck ship – that cruised around the islands at sunset while we had dinner on board. It was fun to meet some of the other attendees, and the evening wrapped up with brief presentations by two authors of current books: Lawrence Solomon, author of The Deniers, and Howard C. Hayden, author of A Primer on CO2 and Climate. It was really good to hear different opinions about global warming, alternate sources of information, and new places to read more about some current theories and measurements. I’ll read the books and then see what I think.

This morning kicked off with George Gilder giving an introduction to this year’s Telecosm, and then Steve Forbes talking about his opinions on the current financial and economic trends in our country. Overall, he is much more bullish about things, and although he acknowledges that the Federal Reserve has made some errors, he seems to believe that we have a lot of opportunity before us. He feels that oil prices are currently a bubble, and that if you measure oil against gold prices, the increase is not as large as claimed. He does feel that a change in monetary policies is going to burst the oil bubble. He is worried about a number of tax cuts that are coming up for renewal in 2010 – capital gains and death taxes – both of which could reduce available capital for entrepreneurs and capitalists.  Steve is always great to listen to …

AsteriskNOW … configuring to use VoicePulse

I was able to get my AsteriskNOW system upgraded, and so now on to the next step … adding a new set of VoIP channels (phone lines) and a new incoming phone number. I wanted to do all of this via VoIP so that I can learn what it takes, and how to do it … and it’s been quite a learning experience.

First, I’m using AsteriskNOW v1.0.2, which includes the ability to add VoicePulse as a Service Provider out of the box. It actually supports several different providers, but I went with VoicePulse on a friend’s recommendation. Their rates looked very good, so I figured I’d give it a try and see how things went …

The first step in setting up AsteriskNOW to use VoicePulse is to set up your VoicePulse account. That involves going to their website, creating an account … and purchasing your initial credit with them. Once you fill out the information on their on-line form, they will e-mail you a credit card authorization agreement, which you have to complete and fax back to them. If you then call them they will grab your fax and activate your account. Once that is completed, they will give you the password to your account so that you can log into the VoicePulse Connect portal … where you manage your VoicePulse account and get all of the details to get things configured.

Once you can log into your VoicePulse account, you’ll want to go to the Credentials tab on the UI. This is where you’ll find your Login and Password for your channels. Now, I’ll do my best to explain channels here … since your base VoicePulse account comes with four of them. Channels are “like” phone lines, but do not necessarily have a phone number associated with them. What value is that? Well … you can make outbound calls. So with a base account, you get the ability to make four simultaneous outbound calls, and you’re charged by the minute to use them. Now you might be asking “What caller ID will show up to the people I am calling?” … I asked that also. Asterisk allows you to assign that value when you configure your system! Anyhow … I digress …

So now that you have your VoicePulse credentials, the next step is to put those into AsteriskNOW. If you login to your AsteriskNOW admin web page, you can go to the Service Providers menu, and click the Add Service Provider button. Select a Provider Type of VoIP, and you’ll then see VoicePulse listed. When you select VoicePulse, you’ll then be prompted for the Username and Password that you got from your VoicePulse Credentials web page. Enter both values, and click Save.

You should now see your VoicePulse account appear in your List of Service Providers. That is almost all there is to do! I did change one additional setting to get things going … I had read about this on a page I found via Google. To the right of your new entry, there is a pull-down menu labeled Options … select that menu, and then choose Advanced. On the dialog that appears, I put the value 5060 in the Port field. Click Update and you are done!
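For anyone curious what this amounts to under the hood, a Service Provider entry of this sort corresponds roughly to a SIP trunk definition like the one below. This is a hand-written sketch, not what AsteriskNOW actually generates – the section name, host, and context here are illustrative assumptions:

```ini
; Illustrative sip.conf-style trunk for a VoicePulse account (names are assumptions)
register => myLogin:myPassword@sip.voicepulse.com

[voicepulse]
type=peer
host=sip.voicepulse.com
username=myLogin
secret=myPassword
port=5060                 ; the Advanced-options Port value set in the GUI
context=from-voicepulse   ; where incoming calls land in the dialplan
```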

What is amazing is that you are now ready to go! That’s all there is to it. If you are familiar with Asterisk, you can now begin to configure both incoming and outgoing configurations! In my case I wanted to configure two additional items – a DialPlan that allowed me to make use of these new lines, and also add a new incoming number (DID).

To add access to these lines, I simply went into the Calling Rules in the AsteriskNOW admin portal, and added a new Calling Rule into my DialPlan … since I’ve been using the “dial 9” and “dial 8” convention for my other lines, I added a rule to “dial 7” to use the new VoicePulse channels. Done.
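In raw Asterisk dialplan terms, a “dial 7” rule like this boils down to a pattern that strips the leading 7 and sends the remaining digits out the VoicePulse trunk. This is just a sketch of the idea – the GUI writes its own version, and the trunk and context names below are assumptions:

```ini
; extensions.conf-style sketch of a "dial 7" outbound rule (illustrative only)
[outbound-voicepulse]
; _7NXXNXXXXXX matches a 7 followed by a 10-digit US number;
; ${EXTEN:1} drops the leading 7 before dialing out.
exten => _7NXXNXXXXXX,1,Dial(SIP/voicepulse/${EXTEN:1})
exten => _7NXXNXXXXXX,n,Hangup()
```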

Then, to add my new phone number I simply went back to the VoicePulse portal, and selected their Numbers tab at the top. I scrolled down to where it says Add Phone Numbers, selected my state, the area code, and the city … and then chose a phone number. Clicking the Activate Selected button then activated my new number. Now to configure AsteriskNOW … I went back to my AsteriskNOW portal, and chose the Incoming Calls menu item. I then clicked the Add an Incoming Rule button and defined my new rule … All Unmatched incoming calls, from provider VoicePulse, to extension 5000 (one of my Ring Groups). Done.

Within 30 minutes, I called my new VoicePulse phone number, and my Ring Group was ringing … all too simple. I also went and grabbed a copy of X-Lite … a free SIP phone … and pointed it at my AsteriskNOW box … it connected right up and is working great!

AsteriskNOW – Upgrading Beta 6 to Release v1.0.2

Asterisk is one of the amazing projects of the Open Source world. AsteriskNOW takes that project even further by creating a complete turn-key package that is extremely easy to install and configure. With AsteriskNOW version Beta 6, I was able to take an old Dell PC that I had, buy two $20 cards, and set up a two line answering system with call menus and call routing … and voicemail that e-mails me attached .WAV files. Oh … and I was able to do all of that in one evening … maybe 3-4 hours from start to finish.

Well, Beta 6 was released a while ago, and I’m now wanting to add some new ITSP (Internet Telephony Service Provider) services, and remote “over the Internet” phone extensions. In talking with the ITSP support people, I really needed to update to the released version of AsteriskNOW … v1.0.2 … and so I started looking around for the instructions on doing it … which wasn’t easy to locate. After the right combinations of keywords, Google finally pointed me at this article – [*NOW-1.0.1] Status: RELEASED!! (Officially) – which got me going in the right direction. As I’m walking through the steps right now, I figured that I would write about my experience … and add some more detail of my experiences …

  1. SSH into your AsteriskNOW box … using ‘admin’ and your password
  2. At the command prompt, you now want to use the following command to update ‘distro-release’:
    • sudo conary update distro-release
  3. You will see the output of the command scroll by, as the latest configuration settings and information for AsteriskNOW are retrieved
  4. When the update is completed, you’ll be dropped back to the command prompt
  5. Open a browser and point it to the rPath Appliance Platform Agent: http://{yourServer}:8002/rAA
  6. When the page opens, you’ll be asked to login. If you have never logged in here before, the default username is ‘admin’ and password is ‘password’. When doing this the first time you’ll have to complete a series of questions to configure this.
  7. Once you are logged into the rAA, there are a series of navigation links going down the left side … select “System Updates”
  8. On the page that appears, click the button to “Check” now for updates!
  9. You will see the page update with the status as the list of required updates is acquired.
  10. When this stage of the process is completed, the page will refresh, and a new button will appear that will allow you to “Apply” the updates.
  11. Click the Apply button. (Easy .. huh?)
  12. Now, you’ll get to watch the process of downloading and installing all of the required packages. In my case there were 154 packages, and the process was filled with all sorts of downloads, installs … and delays. And more delays. Why I’m writing this blog post is due to the huge delay that I am experiencing now … with 124 of 154 downloads done … and installing 118 of 154 updates. In reading more on-line … it appears that these hangs are common. Bummer.
  13. After 30+ minutes of hanging … I clicked away from the System Updates page. I then got back to it, but had all sorts of errors … crap … crap … crap …
  14. Chose the Reboot option. Said small prayer …
  15. Server failed to come up … went downstairs … it was shut down. I powered it up …
  16. Came back to the rPath Appliance Platform Agent: http://{yourServer}:8002/rAA
  17. Had to accept a few more wizard questions … then I got back to the rAA page with left navigation links.
  18. Select “System Updates” … again …
  19. Now I have 68 downloads, and they are downloading and installing …
  20. After the process completes … you are told that the services are restarting!
  21. Reboot your server … just to clean things up …
  22. Go back into the rPath Appliance Platform Agent: http://{yourServer}:8002/rAA
  23. Select “System Updates” … mine had one more ‘kernel’ erase to get rid of an old kernel.
  24. Point your browser at the AsteriskNOW box … and …
  25. And now … it still doesn’t work! 🙁 Crap … crap … double-crap!
  26. I started to ask questions on the #asterisknow irc channel … (Thanks bkruse!)
  27. Looking in the logs I found an error loading modules from a bad path /usr/lib64/asterisk … I’m not using a 64 bit machine …
  28. SSH into your AsteriskNOW box … there is an error in the /etc/asterisk/asterisk.conf file …
  29. At the prompt enter: sudo vi /etc/asterisk/asterisk.conf
    • for some reason there was a reference to the /usr/lib64 directory for modules …
    • look for the line that reads: astmoddir => /usr/lib64/asterisk/modules
    • edit that line to now read: astmoddir => /usr/lib/asterisk/modules
    • save the file and then reboot the box .. again …
  30. Alrighty … open your browser and go to the admin page … crap. Nothing.
  31. More questions on the #asterisknow irc channel … (more thanks to bkruse!)
  32. SSH in to your AsteriskNOW box … again …
  33. At the prompt enter: sudo vi /etc/asterisk/http.conf
    • look for the line that reads: ;prefix = asterisk
    • edit the line to now read: prefix =
    • save the file and then reboot the box … again.
  34. Point your browser at the rPath Appliance Platform Agent: http://{yourServer}:8002/rAA
  35. You should be done now!

Now … I’m guessing that I could have maybe just done a good backup, and then installed the Release v1.0.2 and then restored from backup … however I couldn’t find ANY documentation that explained if that would work.

Also, your own experience might vary … from reading comments on the web page above, it seems that people have had hangs and lock-ups multiple times, and at different places during the update.  It seems that I was able to recover from most of the issues that I ran into.  The biggest resources in this case were Google and the IRC channel … using both of these allowed me to complete this process … although it took me the better part of the day to do so.  Another key to debugging was the various log files … make sure to look in /var/log/asterisk/messages … lots of good stuff there.

Even with all of the hassles, I have to admit that I am really impressed by Asterisk.  The next steps for me are to upgrade the cards that I’m using in the box … I’ve got some older X100P cards, and they work … but the quality is not the best.  I’m now about to sign up for VoicePulse to get new phone service for my wife’s business.  This will be the next step in using Asterisk for me.  Lastly, I ordered a S101I IAXy to create a remote line at her travel office in Salt Lake City.  Once I get that installed and working, we’ll be able to integrate with their PBX and have her admin answer the phone for her when she is not available …

I’m looking forward to having it all working!  If it goes well, then I’m buying a second phone number from VoicePulse for my new company …

Man Machine Interface Improvements … Rats and Monkeys

I love to follow the advances in Man/Machine interfaces. For a long time, people have been experimenting with both invasive and non-invasive interfaces, using a variety of methods to monitor both brain and nerve signals.

Neural implants continue to make huge advances, and the probes and various hardware required are also advancing rapidly. When you look at companies like Cyberkinetics (Actually Cyberkinetics Neurotechnology Systems, Inc. with R&D here in Salt Lake City, Utah) they have an expanding product line which includes the BrainGate Neural Interface System:

The BrainGate Neural Interface System is an investigational medical device that is being developed to improve the quality of life for physically disabled people by allowing them to quickly and reliably control a wide range of devices including computers, environmental controls, robotics and medical devices.

Besides the presentations about Jesse Sullivan – the “bionic man” – that I have seen, I just saw the most impressive demonstrations on YouTube … of course. I was actually watching the History Channel at home the other night and saw a short clip about the Roborat. Well … YouTube had the Roborat video, and if you haven’t seen it … you’ve got to watch it. Researchers have now inserted probes into the brain of a rat to allow remote control of the rat! They even added a wireless webcam to allow the controller to see what the rat is seeing.

The part of this that is wild is that neural stimulation is being used both to cause the rat to turn left or right, and to stimulate the pleasure center of the brain to provide reinforcement for the actions. The rat continues to learn to “obey” the signals driving it, in order to gain the pleasure stimulation.

This is a variation of the research being done with monkeys and additional appendages. Check out this YouTube video of monkeys controlling a robotic arm through thought! With arrays of neural probes inserted into their brains, a computer monitors the brain activity and moves the arm. The monkeys have actually learned how to control their brain activity to cause the intended motions. Feeding themselves with a robotic arm …

Now … if Ray Kurzweil is right, and we’ll eventually be able to perform neural stimulation through blood-borne nanomachines, then this type of work could be done non-invasively … or at least not having to go through the skull. Imagine that you might just get injected with a syringe of nanomachines that have the ability to stimulate your neurons … and turn you into a remote control human! 🙂

Impressive Commuter Vehicle … the Aptera Typ-1

Wow. I know that other people might be well aware of this vehicle … but I just found it and this is impressive. Aptera Motors, Inc. – based out of Carlsbad, California – is creating a revolutionary commuter vehicle and is about to go into production. For the last five years they have been designing and testing this new Typ-1 passenger vehicle that is like nothing I ever imagined being this close to reality.

The Aptera Typ-1 details are amazing …

The Typ-1 uses a commoditized, ‘ruggedized’ 3-phase motor controller designed for vehicular applications, and a 3-phase motor made for us by a company here in Southern California. The rear drive suspension, and the drive reduction, are all designed and made by Aptera. Since the Typ-1e (electric) and the Typ-1h (series plug in hybrid) have different battery needs, this may result in different battery manufacturers for the two models. The Typ-1e is designed to use a 10 KWh pack, while the Typ-1h uses a smaller pack. The cycles and DOD are different for both applications. We will announce further information regarding the battery lifespan and warranty policy well before we begin manufacturing the Typ-1 next October.

Diesel or Gasoline? Our first prototype, the Mk-0, was a parallel hybrid Diesel and achieved an average of 230 MPG at a steady state of 55 MPH. This was pure Diesel/mechanical drive with no electric assist. Diesel is attractive for its Carnot efficiency and the increased enthalpy of Diesel fuel vs gasoline. However, diesel contains lots of unburned hydrocarbons and NOX compounds, and it’s impossible to get a small Diesel engine certified for emissions in California. Therefore, the typ-h uses a small, water-cooled EFI Gasoline engine with closed loop oxygen feedback and catalytic converter. This engine is coupled to a lightweight 12KW starter/generator. It’s very clean and quiet.

The design is three-wheeled, allowing it to be classified as a motorcycle in many states … and allowing it to use the commuter lanes. As for the performance?

With the All Electric Aptera, it is very easy to figure out the mileage range. The mileage is determined by the distance you can drive, under normal circumstances, until the batteries are effectively drained. In the case of the first Aptera typ-1e, we have calculated the range to be about 120 miles.

With the Plug-in Electric Hybrid version of the Aptera(typ-1h) the mileage of the vehicle is difficult to describe with one number. For example, the Typ-1h can drive 40 to 60 miles on electric power alone. Perhaps for such a trip, the engine may only be duty-cycled for a few seconds or minutes. This would produce a fantastic number, an incredible number that, though factually true, would have no useful context, i.e. it’s just a point on a graph.

An asymptotic decaying exponential is an accurate way to describe the fuel mileage of the Typ-1h. For example, driving say 50 miles, one might calculate an MPG number that’s 2 or 3 times higher – say, 1000 MPG. As battery energy is depleted, the frequency of the engine duty cycle is increased. More fuel is used. At 75 miles, the MPG might be closer to 400 MPG. Again, we’re using battery energy mostly, but turning the engine on more and more. Just over 100 miles we’re just over 300 MPG, and just beyond 120 miles, we’re around 300 MPG.

So why pick a number at 120 miles? Well, it’s more than double of most available plug-in hybrid ranges that achieve over 100 MPG. It’s three times the distance of the typical American daily commute. It’s a meaningful distance that represents the driving needs of 99% of Americans on a daily basis. Sure, it’s asymptotic, after 350-400 miles it eventually plummets to around 130 MPG at highway speeds where it will stay all day until you plug it back in and charge it up.
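To make the shape of that curve concrete, here is a toy decaying-exponential model. The ~130 MPG floor comes from the quote above; the amplitude and decay constant are made-up parameters I picked only to illustrate the asymptotic behavior, not Aptera’s actual data:

```python
import math

def typ1h_mpg(miles, floor=130.0, amplitude=2500.0, decay_miles=45.0):
    """Toy model: enormous MPG while the battery carries most of the load,
    decaying toward the engine-sustained ~130 MPG floor on long trips.
    All parameters except the floor are illustrative guesses."""
    return floor + amplitude * math.exp(-miles / decay_miles)

for d in (50, 120, 400):
    print(d, "miles ->", round(typ1h_mpg(d)), "MPG")
```

With these illustrative parameters the model lands near 950 MPG at 50 miles, around 300 MPG at 120 miles, and settles at ~130 MPG beyond 400 – roughly the contour the quote describes.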

And all of this for an estimated $30,000 sticker price! Read more of the Aptera facts and then think about driving one of these to work … I want one! The problem is that for now they are limiting the market to the LA area … bummer.

Web 2.0 Expo – Mobile 2.0: Design and Develop for the iPhone and Beyond

The second presentation at Web 2.0 Expo that I attended on Tuesday of this week was a great presentation by Brian Fling of Fling Media, and the creator of the Mobile Design blog. He started off with a really good review of all of the “layers” in the “mobile application” stack. There are a lot of complex pieces in place in the ‘telco’ world … the underlying technologies, the networks, the carriers, and the phone devices themselves. It’s really created a nightmare for software developers to write good mobile applications.

While at the CTIA show this last month, I learned a lot about the carriers, and their issues with software developers. The first was that software can often cause additional expenses to the end user, and if the user doesn’t like their bill they won’t pay it. On top of that is the second issue … the user will then call the carrier to be upset with them … and that support call can cost – on average – $20-$25 each call! So now they have an upset, non-paying customer on the phone to their support center costing them money … not good for cash flow. The one comment I heard was that two support calls a month makes you a non-profitable customer. And so the end result? The carriers want complete and tight control on the applications being deployed to cell phones.

Brian talked about the lock that the carriers have on the network, the phone devices, and the applications being deployed, and he then showed a great YouTube video of Jason Devitt (of Skydeck and Vindigo Studios) testifying before Congress about open access to the cellular networks. (Jason Devitt testimony before Congress) This video is really good to watch if you want to learn more about the limits imposed, if you aren’t aware of the current situation. It is, in many people’s opinions, completely crippling the creativity and value that could be delivered to your phone.

Some of the statistics presented were:

  • Current Global Mobile Users: 2.9 billion
  • Current Mobile Users with Web Access: 1.3 billion
  • Current Desktop Users with Web Access: 1.1 billion

This is fascinating as it shows that more people can access the Internet and the Web from their mobile devices than from desktop or laptop PCs! The prediction that he quoted indicated that by 2010, half the world’s population will have Mobile Web Access!

Brian then presented his thoughts on Mobile as being the 7th Media:

  1. Printing Press (print)
  2. Recordings (audio)
  3. Cinema (video)
  4. Radio (remote audio)
  5. Television (remote video)
  6. The Internet (globally distributed and accessible)
  7. Mobile … with you all of the time!
He expanded on his thoughts in this area as he reviewed the real revolutions of Mobile Media:

  • Personal mass media – it’s your device!
  • First ‘always on’ mass media – most people leave it turned on
  • Always carried – you have it with you everywhere
  • Built in payment channel – via your phone bill, or browser
  • Point of thought – when ever/where ever – it’s just there!

I was really impressed as he then talked about the fact that this is the only form of new media that can do everything the first six media types were able to do! Your mobile device can provide you with printed/text information, stored or streamed audio and video, and real-time Internet access … all of the time, almost everywhere you go on earth.

From there, Brian moved into the value of location finally being realized. Location Based Services – LBS – are becoming reality due to the capabilities in new mobile devices. We often think about the possible privacy concerns … and they are real. But new ways of using Wifi, cell towers, and GPS are providing access to where you are … where your mobile device is. This then leads to the “Contextual Web”: mobile applications are now able to give you information based on when and where you are.

All of this information set the stage for the rest of his talk about the current state of application design for mobile devices. Although the iPhone is completely changing the landscape, and pushing other mobile device vendors to wake up and move in new directions, there are still many “least common denominator” constraints that any developer will want to consider when creating mobile applications.

His overall lessons about developing for mobile devices are what he calls the Three C’s of the Mobile Web:

  • Cost (be conscious of the user … and their phone bill!)
  • Content (how you design the content for small device screens)
  • Context (how will the user use it … where they use it)

This is where the presentation then moved to really focus on the iPhone … and why it is having the biggest impact in the mobile world … ever. He began by quoting Craig McCaw … someone who was instrumental in the development of the cellular market in the US:

Change occurs because there is a gap between what is, and what should be – Craig McCaw

Brian spoke about how he feels that the iPhone is the definition of Mobile 2.0. He believes that it is the first phone to truly integrate all of the necessary components to shift the future of mobile applications. When he attended one of the last Mobile 2.0 conferences, he commented on the 10 things that he learned:

  • Mobile 2.0 = the web
  • Mobile web browser is the next killer app
  • Mobile web applications are the future
  • AJAX is the next frontier
  • Rich interactions kill the web
  • Mobile User Experience sucks
  • Mobile Widgets are the next big things
  • Carrier is the new “C” word … they suck
  • We are creators … not consumers.

Based on all of these things learned, it is interesting to see how many, if not all, are addressed by the introduction of the iPhone in the market. Brian’s thoughts in this area are the iPhone Strengths:

    • Smart phone for the masses
      • Although there have been many smart-phones produced, and sold into business, this is the first real “usable” smart phone. Period.
    • Flat-rate data plan
      • There is now no worry about “How much data am I using?” Users don’t have to worry about surprise phone bills. This, in my opinion, is the biggest value that Apple delivered.
    • Device sold and supported outside the carrier
      • Carriers are very worried about the cost of support calls. Apple is training users to contact the phone/device manufacturer (Apple) instead of the carrier.
    • No subsidization
      • This, in my opinion, is the second largest impact of the iPhone! People are paying hundreds of dollars for a phone … something that simply wasn’t done in the past with this class of mobile device owners.
    • Updatable software (4 updates in 9 months)
      • The mobile device can now evolve and be updated easily by the user. In the past, few updates ever appeared for phones once they shipped.
    • Location awareness
      • Huge. Integrated into the iPhone are ways to obtain information about Wifi access points in the area, and cell towers that are “visible” to the phone. Both of these methods are then used to attempt to fine-tune the user’s location.
    • Resetting bandwidth expectations
      • Here, the iPhone is giving the masses the first experience of broadband Internet access. It is setting a base threshold for expectations.
    • Portable device convergence (phone/ipod, etc.)
      • So many people didn’t want a converged device … because they wanted their iPod. The iPhone has now integrated digital camera, iPod, portable browser, and phone … all in one sleek device.
    • Web & Mobile standards (great adherence in rendering)
      • The Webkit browser engine used within the iPhone has an incredible level of support for the current HTML/CSS/Javascript standards allowing for rich application designs.
    • Impact on developer communities (developers moving into iPhone development)
      • The iPhone has ignited a real interest in people developing applications for mobile devices … specifically the iPhone.

    Some sales data that Brian provided:

    • 4 million sold in the first 90 days
    • 2.5 million last quarter
    • 10 million by the end of the year … they are at 70% of that now!
    • 1457 Mobile apps as of yesterday
    • 131 new apps in the last 14 days
    • 600 new apps in the last 60 days
    • 84.8% of all iPhone owners are accessing news and information
    • iPhone is already 23% of all mobile web traffic
    • Wired magazine showed the iPhone as .09% of all web traffic
    • Brian showed the iPhone web stats report … March 2008 now at .15% of all web traffic

    Now, besides the iPhone, the newer Apple iPod Touch is becoming a new variation of the iPhone. It is a WiFi media device that can also be considered a powerful mobile device … containing the same Webkit browser engine … allowing iPod Touch users to cruise the web via Wifi. He asked people to think about how quickly the iPod Touch might begin to replace the more common iPod … there were 22 million iPods sold in Q1 of 2008 … over 140 million sold in 6 years.

    For the more technical web application developers, this is where Brian finally cut to the tech details. First tip … you really have to design with bandwidth in mind. For those who have never developed for dial-up modems, this is the time to experience the “old days” of the Internet. In his testing, Brian said you really have to assume that the maximum bandwidth the average iPhone user will see is ~125kbps on Edge … and even on Wifi, you *might* see rates of ~4Mbps … but design your mobile device applications for 125kbps! Keep the content light!
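
    To put those numbers in perspective, here is a quick back-of-the-envelope calculation (the function is mine, not Brian’s, and it ignores latency and protocol overhead, which make Edge feel even slower):

```javascript
// Rough page load times at the bandwidth figures from the talk:
// ~125 kbps on Edge vs ~4 Mbps (4000 kbps) on Wifi.
function loadTimeSeconds(pageSizeKB, linkKbps) {
  var kilobits = pageSizeKB * 8; // 1 KB = 8 kbit
  return kilobits / linkKbps;
}

loadTimeSeconds(200, 125);  // a 200 KB page over Edge: 12.8 seconds
loadTimeSeconds(200, 4000); // the same page over Wifi: 0.4 seconds
```

    A modest 200 KB page costs the Edge user nearly thirteen seconds … which is exactly why “keep the content light” is rule number one.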

    There are also two key features that you are going to want to program into your website:

    • Device/Browser Specific Stylesheets
      • Use the “media” queries (slide 268 above)
    • Device Detection (Javascript redirect or Server Redirect)
      • userAgent matching or server side (slide 270 above)
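
    Both features above can be sketched in a few lines. This is my own illustrative code, not from the talk … the regex, the hostname, and the stylesheet name are made up, and the 480px media query is the commonly cited iPhone technique of the day:

```javascript
// Naive userAgent sniff for iPhone-class devices.
function isIPhoneUA(ua) {
  return /iPhone|iPod/.test(ua);
}

// Browser-side usage (sketch): redirect iPhone visitors to a mobile site.
//
//   if (isIPhoneUA(navigator.userAgent)) {
//     window.location.replace("http://iphone.example.com/");
//   }
//
// The stylesheet route instead uses a media query on the <link> tag:
//
//   <link rel="stylesheet" href="iphone.css"
//         media="only screen and (max-device-width: 480px)" />
```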

    Also, when developing applications for the iPhone, he mentioned that there are several specific limits that Apple has imposed:

    • 10MB download limit
      • no bigger objects!
    • Javascript execution time limit – 5 seconds for each top-level item
      • You can’t have any long-running Javascript routines.
    • No Flash or SVG
      • This is still disappointing to me … I really like both!
    • No Java
      • I’m not a Java fan, so no biggy to me.
    • No mouse-over, hover, tooltip mouse events
      • The biggest impact here is drag-and-drop, etc.
    • No file downloads or uploads
      • Again … limits what you can do to get data to and from the iPhone
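
    The 5-second script limit has a standard workaround: break long-running work into slices and yield between them with setTimeout. A minimal sketch (processChunked is my own helper, not an Apple API):

```javascript
// Process a large array in small slices so no single script run
// exceeds the browser's execution time limit.
function processChunked(items, handleItem, done, chunkSize) {
  var i = 0;
  function step() {
    var end = Math.min(i + chunkSize, items.length);
    for (; i < end; i++) {
      handleItem(items[i]);
    }
    if (i < items.length) {
      setTimeout(step, 0); // yield to the browser, then resume
    } else {
      done();
    }
  }
  step();
}

// Usage sketch: render 10,000 rows 100 at a time.
//   processChunked(rows, renderRow, function () { /* finished */ }, 100);
```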

    There are now several development kits and tools that can be used, and Brian said that all of these are useful for iPhone applications:

    • iUI by Joe Hewitt
    • Google Web Toolkit (GWT)
    • Yahoo! User Interface Library (YUI)
    • many other icon and interface design tools

    For the PHP crowd out there, Brian reiterated the importance of bandwidth … and size.  He stressed that you really want to reduce content size:

    • use ob_start(“ob_gzhandler”) for PHP content
    • remove whitespace
    • refactor Javascript to be efficient, remove extra brackets, etc.
    • compress images or use CSS where possible
    • cache data on the server for good response times

    Overall, what was really fascinating to me was that he continued to recommend using HTML and AJAX for iPhone applications … for now.  The real issues of creating an installed iPhone application still center on the uncertainty of whether you will ever get your application installed on the phone!

    Apple has taken the approach that, although there is an open developer kit, the actual ability to get an app to a user will be limited initially … and all of the implications are still not quite clear.  If you simply develop an application on your server that can detect the iPhone browser, you can then deliver a very customized experience to the user with great success.  No need to install anything …

    Obviously the limitation of this approach is that you will lose the tight integration with many of the iPhone features, such as the location data and the accelerometer.

    I’m now looking to obtain a couple of used iPhones for testing … or maybe an iPod Touch.  I want to see what we can come up with for a good iPhone UI for one of our recent projects.  This session was great for learning some of the more in-depth aspects of what to consider when developing for the iPhone.

Web 2.0 Expo – Your Digital World – Meshed!

I’m sitting in the Web 2.0 Expo session on Microsoft Live Mesh where Ori Amiga of Microsoft is doing a demonstration of the current solution. I’m slowly getting a better idea of what they are doing, and how this all works.

Ori started off by logging into his Live Mesh account on-line. He showed where he manages his “ring” of devices … the devices that are to be included in his synchronization cloud. He had PCs, Media PCs, an Ultra Mobile PC, his cell phone … and his Macbook. He was able to navigate the “ring” and view the status and information about the devices and their connectivity.

The other core part – to me – that he demonstrated was the “Live Desktop”. In the portal, you actually get a “Live Desktop” that is like a basic desktop in the cloud. There, you have a variety of tools and applications that you can use to interact with your Mesh, and your shared/sync’d resources. He showed where he could create a shared folder on the Live Desktop. He could also see various event streams in a small desktop client application showing updates and changes to the various shared resources. What was cool was that he then flipped to looking at his laptop, and had all of the same capabilities … the shared folders were on the desktop of his laptop, and he had the same client application that allowed him to see the event streams and contents of the mesh resources.

Ori then went through a full demonstration of photos being shared, through the mesh, across his laptop, a Media PC, his phone, and his Macbook. He took a photo with the Macbook that sync’d everywhere … and a photo with his cell phone that sync’d everywhere. This is a very simple demo, but what was impressive was the underlying protocols and engine being used to provide this capability. The “shared folder” is actually also a “feed” that can be subscribed to, and that has enclosures that link to the associated file objects. Updates to the local “folder” on one machine cause POST updates to the others in the cloud … and then synchronization occurs by the other remote devices pulling from the feeds on the local device.
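
The push-then-pull flow he described comes down to each device reconciling a pulled remote feed against its local copy. A minimal sketch of that merge (the entry shape, an id plus an updated timestamp, is my guess at the idea, not the actual Live Mesh schema):

```javascript
// Merge a pulled remote feed into the local entry set, keeping the
// newer version of any entry that appears in both.
function mergeFeed(localEntries, remoteEntries) {
  var byId = {};
  localEntries.forEach(function (e) { byId[e.id] = e; });
  remoteEntries.forEach(function (e) {
    var mine = byId[e.id];
    if (!mine || e.updated > mine.updated) {
      byId[e.id] = e; // unseen or newer: take the remote entry
    }
  });
  return Object.keys(byId).map(function (id) { return byId[id]; });
}
```

Run symmetrically on every device in the ring, a merge like this converges all copies to the newest version of each entry … with the enclosures then fetched for any entry that changed.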

The presentation then went into a little more depth on the “feeds” and the models that are being used to define the architecture. I’ve added some notes below on this portion … but I am thoroughly impressed at where Microsoft has taken the concept of “feeds” … they have created a complete Resource Model around feeds … and an engine to leverage this model and provide the core mechanisms. The pluggable architecture of the Moe (Mesh Operating Environment) allows for the feeds to be rendered in a wide – and completely extensible – range of formats … RSS / ATOM / XML … and anything to come.

In a later demo, Ori showed where a web property can have an “Add to my Live Mesh” button, so a web user can add that site/application to their Live Mesh. The demonstration was shown using Silverlight, and he was able to then run the application locally and have all of the same capabilities of the hosted application. He even made a comment about a photo on the local application, and the comment then appeared on that same photo on the hosted application. The idea here is that the usage can become very transparent to average users.

Ori also showed a very specific client application that was written to use the Mesh APIs and provide even more real-time data synchronization capabilities, keeping two different complex data sets in sync. The demonstration was a “family tree” application, and as he updated photos and names, the other “remote” application reflected the updates in real-time. Here he was no doubt showing that there can be tight integration with the local Moe to subscribe to events on mesh resource updates.

Upon leaving the session, I immediately went and complained to the Microsoft folks about the problems I was having with my preview account sign-up. They jumped to action and sent me down to the Microsoft booth … where I could again reproduce the same problem. I was really getting bummed … I wanted to experiment with all of this.

For the lunch break I sat down on the 3rd floor, and was about to eat when I noticed some friends from Microsoft at a nearby table. I went over and again told them about the problems I was having … it turns out that Amit Mital was standing behind me! As the Mesh General Manager, he immediately asked me for my Live User ID, and sent off an e-mail … as I finished my lunch he came back over and told me it was fixed! Sure enough … I logged in and am now Meshed!

Here are a couple of other thoughts …

Key Elements of Live Mesh:

  • Resource Model
    • The entire resource model was reduced to the idea of a “feed”. This is identical to an RSS or ATOM feed. It can contain DataEntries, and those can have Enclosures.
    • There are DataFeeds, NewsFeeds, DeviceFeeds, MembersFeeds … it is extensible.
    • A Mesh Object can then contain one or more feeds.
    • Mesh Objects can be viewed … in a feed … of course!
    • Moe – the Mesh Operating Environment – can then render the feeds in any format … and can also be extended with new renderers.
    • The local client can be directly accessed to get all of these feeds in the mesh! e.g. http://localhost:2048/Mesh … so feeds are automatically – and symmetrically – replicated to all devices in the ring.
  • Cloud Services
    • Ori didn’t spend a lot of time on the Cloud Services …
  • Client Runtime
    • Upon adding my laptop to my “ring” there was a 1.5MB client download. This appears to be the Moe, and it added a new icon to my tray. Through this little tray icon I can open the Mesh Notifier, which shows me my device “ring” and recent activity. I didn’t have to reboot to get this going … woohoo!
  • MeshFX
    • Although he mentioned this, he didn’t spend a lot of time talking about it.

The brief discussion about the Mesh Developer Stack really came down to the Core Tenets of their efforts.

  • Core Tenets:
    • Open (Protocol Based)
    • Resource Oriented
    • Consistent S+S Programming Model
    • Extensible

I have to admit that I am impressed with what I am seeing so far … it’s really a cool direction for Microsoft to go, and I can only imagine all of the applications that might begin to appear. I’m disappointed right now that the developer info is most likely going to be out at PDC – the Microsoft Developer Conference – in the fall … I want it now! I’m going to ask around and see what I can find out … I’ll be blogging about this … 🙂

Web 2.0 Expo – Microsoft Mesh

I’m sitting in the keynote at the end of Day Two of the Web 2.0 Expo.  Microsoft’s Amit Mital, Mesh’s General Manager, is presenting.  He started off by talking about the fact that we all have more and more Internet-attached devices – from PCs to laptops, to mobile phones.  He then commented on how all of these devices do not yet share all of your information … they don’t seem to talk … communicate … or share information.

Live Mesh is a Software + Service platform that provides services for devices to become aware of each other, and to share data and information. He showed the “trailer” about Live Mesh.  In the example, he showed where a photo being taken on a smart phone can be instantly and transparently synchronized to other phones, PCs, XBox, and even showed “dad” using a Macbook at the airport.  As the phone took the photo and saved it, the photo was immediately synchronized to all of these other devices and made available.  But he stressed this is not just about photo and file synchronization … but is a core engine that provides the mechanisms for the communications between devices using standard protocols … with NAT and Firewall traversal … and peer to peer capabilities.

He then showed a quick look at the “device ring” … this is where you add devices into your mesh.  You can then add “folders” where you can put files.  You can invite others to share via e-mail.  There are then feeds for the mesh, and folders, so that you can see events of what is going on in the mesh.  Data and applications are always available.  Resyncing your data onto a new device … is as simple as adding it into your “device ring” …

His entire presentation was quick … and was received with mixed reactions.  I’m looking forward to playing with Live Mesh, and writing more about my experience and what I learn.  I’ll attend some sessions tomorrow or Friday … and will post more then!

P.S. They are giving us a URL to download the early access code … I can’t wait!  🙂