Configuring Radio Userland to mirror posts to Blogger.com

The Radio Userland Atom Bridge Tool is designed to give anyone using Radio
Userland an easy way to mirror posts to any Blogger.com or Blogspot.com
blog using the Atom API. This tool is based on prior work from
other developers – Dave Winer and Steve Hooker – who wrote the
ManilaBloggerBridge and xManilaBloggerBridge respectively.

The installation is straightforward and should require very little
effort overall. There are a few key points to understand about
how this tool operates:

  • Once it is installed, you will configure the tool using the Radio Userland Tools link on the Home Page.
  • The tool will not mirror posts
    from your blog home page to Blogger … only posts to categories are
    mirrored. This is on purpose so that you can selectively post to
    your Blogger blog by using categories. If you want a post to go
    to your home page, and to a Blogger blog, then simply create a category
    and always check both your home page and that category when posting.
  • This tool ought to work with any other Atom API-compliant
    server … see the sketch after this list for what a raw Atom post
    looks like. Give it a try; if it fails, come back here and
    comment or complain. If you provide enough information I’ll see
    if I can take a look at it and get it working.
  • NOTE:
    There is still one dependency on another tool from Steve Hooker – the
    backLogAllRSS Tool. You will have to go to his backLogAllRSS web page and download a copy
    of this tool, and install it in Radio.
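
For the curious, here is a rough sketch of what an Atom API post looks like on the wire, using curl rather than the tool itself. The endpoint format, credentials, and entry.xml file are all assumptions for illustration … consult Blogger’s Atom documentation for the real details:

# hypothetical hand-rolled Atom API post to Blogger
# (endpoint, credentials, and entry.xml are placeholders)
curl -u myusername:mypassword \
  -H "Content-Type: application/atom+xml" \
  --data-binary @entry.xml \
  "https://www.blogger.com/atom/99999999"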

Ok … so how to get started. First, go and download the RadioAtomBridge Tool.
Once you have it downloaded, copy it into the Radio Userland\Tools
folder on your computer. On Windows systems, this is usually
Program Files\Radio Userland\Tools. I’m not really sure where this
folder is on a Mac. When you copy the tool into this directory,
give it a minute or so and the tool will be installed.

Next, you’ll want to go to the Radio Userland Home Page, and then click the Tools
menu. You ought to see a new tool called the
RadioAtomBridge. Make sure that the checkbox to the left of the
name is checked. If it is not, then check the box to enable the
tool and Submit the page. If the tool is enabled then the name
RadioAtomBridge will be a highlighted link … click it.

You will now be presented with the configuration page. You will
see each category and a checkbox where you can enable that category to
be mirrored. The first three configuration fields are already
configured for Blogger.com, so you only need to update the Blog ID,
your Blogger username, and password.

To get your Blogger Blog ID, go to Blogger.com and log in. On the
Dashboard, click on the name of the blog that you want to configure in
Radio. If you now look at the URL in the address bar of your
browser you will see a URL like:
http://www.blogger.com/posts.g?blogID=99999999

… your Blog ID
is the number at the end. Your Blogger username and password ought to be
obvious.
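
And if you ever want to pull that number out of a URL programmatically, a quick one-liner will do it (purely for illustration):

# extract the blogID parameter from a Blogger URL
echo "http://www.blogger.com/posts.g?blogID=99999999" | sed 's/.*blogID=//'
# prints: 99999999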

Lastly … do not check the box about Manila sites. I’m probably
going to remove this option as it is an older feature of the
ManilaBloggerBridge.

Once you have entered all of these settings, Submit the page and your
changes will be recorded … and you’ll be ready to start
posting. Go back to your home page … write a post … check the
category that you configured. After you have posted, go to the
Radio Userland Events page and look for the indication that the post to Blogger occurred. You can also go back to the RadioAtomBridge
page and scroll down to your category. There should now be some
statistics about the post, and possibly an error message if there are
problems.

So far it works for me … I’m going to keep testing and might have
some updates. Good luck and I hope that someone finds this
valuable!

SCORM and eLearning

In my new job at “Agilix Labs” I have been introduced to a lot of new –
and unknown to me – electronic learning technologies. We have
recently partnered with Blackboard, one of the leading creators of e-Education software and systems. I have also been educated about WebCT, another leader in this same space. In the Open Source community, solutions like Sakai are gaining ground at various higher education institutions.

Overall, I had no idea that so much was going on in the automation and
computerization of education systems. Of course it only makes
sense, but it is the extent of it – and the growing maturity – that I
was oblivious to.

Today I was quickly educated about SCORM
– the Sharable Content Object Reference Model. Amazing.
There is a good SCORM “brief description” here. It is actually a rich specification for the creation of courseware –
educational software – that includes the course material, coupled with
exercises and exams (assessments), and even some metadata about the
“flow” of the course – the order in which students must complete
different parts before progressing, and even scores that must be
attained – along with where to send the results.

I had my first demonstration of SCORM today in the form of a government
course being given by the Naval Postgraduate School. It was
pretty cool … a .zip file contained the entire SCORM course
(something on marine navigation) and once loaded into Blackboard there
was all of the course material, the exams, and for the student a way to
begin learning.
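
One detail worth noting: a SCORM package is simply a .zip file with a manifest – imsmanifest.xml – at its root that tells the LMS how the course is organized. You can peek inside one from the command line (the course filename here is hypothetical):

# list the contents of a SCORM package ... the imsmanifest.xml
# at the root describes the course organization and resources
unzip -l marine-navigation-course.zip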

Photon Processors already?

Along with other accelerating aspects of technology, processors of
course have to keep in step. There has been a recent trend
towards “mulcoth” architectures … multi-core, multi-threaded.
This is where the single “chip” that is placed in your computer might
actually contain “multiple cores” (multiple processors), and these
processors then provide “multi-threading” or “hyperthreading”
within them. What this gives you is the equivalent of a
multi-processor machine inside of that single chip.
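
On Linux you can already see this effect … every core and hardware thread shows up as its own logical processor, so a quick count (a sketch, assuming a standard /proc filesystem) shows what that one chip is presenting:

# count the logical processors the kernel sees ... on a multi-core,
# multi-threaded chip this exceeds the number of physical sockets
grep -c ^processor /proc/cpuinfo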

Much of the progress towards “mulcoth” processors has been driven by heat
issues that arise as we continue to push up the speed of processors. So
instead of one giant “even faster” processor, you’ll be using machines that have
multiple cores … or multiple processors … running at current speeds in that one chip in your
computer.

I caught this article, which shows where we can begin to track
the progress of newer approaches to breaking the speed limit on
processors: photon processors.

Look out … the future is coming quickly …

Production of Photon Processors Expected in 2006 [Slashdot:]

Towards the 1TB (Terabyte) disk drive

Wow … the doubling continues. There are two key points that I
like about this article. The first is that 500GB PC disk drives
will be on the market this year. That is 100,000 times the
capacity of the first hard disk drive I ever had! That 100,000-fold growth in capacity has occurred in less than 30 years.

The second key point is that it shows no sign of stopping. From this article:

Desktop drive capacity will top out at around 1 terabyte by late 2006,
before running into technological problems in maintaining data
stability.

We are on track to again double by the end of next year! It is
difficult to imagine that common, and eventually commodity, hard disk
drives will reach these sizes. That is a huge amount of
data. In addition, as humans we will simply solve the tough
problems and continue the growth rates with newer technologies.

A last point is that there are currently numerous solutions – the most
popular referred to as RAID – that allow you to aggregate multiple disk
drives into a redundant array that appears as one, even larger, disk
drive. I have followed solutions like this over the years to
mentally track the cost of a terabyte of storage. I am now
stunned that in the next year, I might be buying one terabyte of hard
disk storage in a single PC disk drive.
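
For reference, here is roughly what that aggregation looks like on Linux using mdadm – the device names are assumptions for illustration:

# hypothetical sketch: build a redundant RAID 5 array from four
# drives, then format and mount it as one large disk
mdadm --create /dev/md0 --level=5 --raid-devices=4 /dev/sda1 /dev/sdb1 /dev/sdc1 /dev/sdd1
mkfs.ext3 /dev/md0
mount /dev/md0 /mnt/storage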

PC World: PC Drive Reaches 500GB.
Demand for greater capacity continues to rise due in large part to a
growing need for music and video storage on PCs and consumer
electronics devices. To meet that need, storage vendors are turning to
new recording technologies. The first of these, perpendicular
recording, will debut from Toshiba this year. [Tomalak’s Realm]

Anti-Spam and what to do today

It seems that anti-spam is, of course, all about verified
identity.  I really liked this article, and it got me looking into
what I can do on my mail servers today.  I realized that there are
things I can do when I read this quote:

Reports indicate that as much as 50 percent of sending domains are
authenticating their outbound e-mail using SIDF and signatures.

Wow … am I behind the times!  I went to Google and did some searching.  I found a great blog talking about SIDF, with the following links describing where to read more and what to do.

The first link is the Microsoft SenderID page that has a lot of information.  It also has a link to this SenderID Overview Presentation that gives a great overview of the concepts and how it works.  The last link is to the Microsoft Sender ID SPF Record Wizard … which will assist in creating the actual DNS records that you have to configure.

All of this is oriented towards telling the world that your mail server
is the legitimate source of mail for your domains.  Time to add
more identity information about my mail server into DNS … I’m
creating my SPF Records right now …
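
Once a record is published it is easy to verify … SPF records are plain TXT records in DNS, so a quick dig shows what a domain is advertising (the domain and record contents here are illustrative, not my actual record):

# query a domain's TXT records to see its published SPF policy
dig +short TXT example.com
# e.g. "v=spf1 a mx ip4:192.0.2.25 -all"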

E-mail authentication. Then what?. Sendmail CEO Dave Anderson explains why we’re approaching the end of e-mail as we know it. [CNET News.com]

Up2date e-mail notifications

A while back I had started to experiment with a way to get e-mail
notifications from my servers when up2date detected that new packages
were available. I am running a series of Fedora Core 1, 2, and 3
boxes, and it seems that the updates come quite frequently.

I decided that this weekend I would sit down and write a new bash
script that could be run daily by cron. Here’s what I wrote:

#!/bin/bash
# First let's check with up2date ...

# have up2date list the available packages ...
up2dateOutput=`up2date --nox -l`

# now check to see if packages are listed ...
# take the output, grep for the long hyphen divider
# (grabbing that line and the next line), then awk the second
# line to see if there is a package name at the beginning
firstUpdatePackage=`echo "$up2dateOutput" | grep -A1 "\-\-\-\-\-\-\-\-\-\-" | awk '{if (NR==2) print $1}'`

#echo "First package: |$firstUpdatePackage|"

if [ ! -z "$firstUpdatePackage" ]
# there is a package name
then
  #echo "Sending e-mail ..."
  nodeName=`uname -n` # get the host name
  mailSubject="Up2date - $nodeName"
  # create the e-mail subject line
  echo "$up2dateOutput" | mail -s "$mailSubject" root
  # send the e-mail to root
fi
exit 0
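
To have cron run this daily, one simple approach – assuming the script is saved as up2date-notify – is to drop an executable copy into /etc/cron.daily:

# install the script so cron's daily run picks it up
cp up2date-notify /etc/cron.daily/up2date-notify
chmod 755 /etc/cron.daily/up2date-notify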

So far, it appears to do exactly what I had hoped … an e-mail notice
when there are packages that can be updated on my servers!

C#, .NET, and Visual Studio … amazing.

I have to admit that I am once again amazed by the power of
Microsoft. I just completed my first Microsoft training course in
a long time … to learn the C# programming language. It was an
awesome experience.

I have a long background in developing software, starting with assembly
language, Fortran, Basic, C, and other languages. I never really
moved to Java, but knew that I wanted to learn a current object-oriented
language. Over the last several years I have learned both Perl
and PHP, and these are impressive Open Source languages. When I
saw that the Mono project was getting going last year, I immediately
realized that C# was the language to learn … from a C programmer’s
perspective.

After three days in class I now have a good understanding of C#, which
I plan to use for both Windows and Linux development. The Mono
project is the Open Source project to bring C# and .NET to Linux …
and obviously Microsoft has C#, .NET and their development environment
Visual Studio well established and moving forward. I will be
looking at Mono, but I realize that they have their work cut out for
them … Microsoft’s development environment is impressive.

I have developed in Visual Basic 6 on Windows for a long time, and I
found this to be a spectacular solution for developing Windows
applications. I was able to rapidly create a wide range of
applications over the years, complete with installers, with very little
effort. With all of this, I was spoiled when I had to deal with
text-mode development in Perl and PHP. I was really waiting for
this C# training … knowing that it was going to leverage a lot of the
same technologies.

Some of the core areas of Microsoft’s solution that I was most impressed with:

  • Visual Studio .NET 2003 –
    this is a very impressive Integrated Development Environment (IDE)
    solution. They have done a good job allowing for a lot of
    customization of the development environment. Once I had my
    desktop arranged, it was easy to flip between the visual UI designer,
    and the various code modules. Help was always there, and the
    Intellisense code completion was great. I admit that I wish it
    would complete using the tab key instead of the Ctrl-Spacebar they
    require; however, it is invaluable.
  • Database Connectivity and Development
    – it is beyond easy to develop complex applications that access a wide
    range of databases, and data sources. Within the Visual Studio
    IDE, most of the development can be done using wizards and simply
    dragging and dropping database tables from the Server Explorer.
    All of the code to integrate the data sources, and databases, into your
    code is just written for you. You end up being able to use the
    DataSet wizard to then create the DataSet. Again … all of the
    code is basically written for you … and you are left to focus on your
    core logic and functionality.
  • XML Manipulation – so
    far, I haven’t found anything that I can’t do with XML. In almost
    no time this morning, I was able to program an HTTP request to grab an
    RSS XML file from one of my blogs. I was then able to transform
    this XML file into a DataSet with one or two lines of code. From
    an XML file, to a set of database tables ready to be read.
  • SOAP Client – ok … now
    this was just too easy. I simply located the URI for a web
    service that I was interested in. I actually searched the
    Microsoft UDDI directory through the integrated browser. I found
    a stock quote web service, and clicked the link to add it to the
    project. The next thing I know, I have a new service
    with a couple of new methods that I can call. I then link the
    results of the SOAP request to a DataGrid … and can view the results.
  • SOAP Web Service – now this was just too easy. I simply
    went through a Wizard to create the base service class, and then added
    a series of methods that immediately become web-services methods.
    As I added each new method to the class, the build process seemed to
    simply re-deploy, and everything worked.
  • ASP.NET – now this was the final aspect of the course that I
    received today. I am absolutely blown away at how simple it is
    to create complex sets of interactive pages.

Now … I am completely open to Mono and very interested in its
success … however I now have a model product that they are going to
have to beat. Visual Studio and Microsoft are at the forefront
of development with their offering. I’ll post about my
further experiences … and my experiences with Mono!

Fedora Core 1 upgrades and Sendmail

I have slowly been upgrading all of my old RedHat boxes to Fedora Core
1.  I know that even this is old; however, this is a tested
configuration for what we wanted to do on our wireless network
infrastructure, and there are some known problems with moving to the
v2.6.x Linux kernel.  I don’t want to deal with those yet.

I have now done three upgrades, using the anaconda installer that comes
with Fedora Core, and I have to say that I am impressed.  It just
works.  Except for Sendmail.  In each install that I have
done, sendmail just stops working, and begins to emit useless errors
into the log … or at least they are useless to me.  On this
latest upgrade, I have spent hours debugging the installation
over the last two or three weeks.

Today I was able to find a simple solution to debugging these
issues.  I’m not sure why I didn’t think of this before.  I
simply used “rpm” to erase/uninstall sendmail … and then used
“up2date”  to install it again.  Jackpot!  Sendmail is
now working on this newly upgraded server.  I’m not going to
forget this “solution.”
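
For my own notes, the “solution” boiled down to two commands – package name as on Fedora Core, and rpm may complain about dependencies, in which case more care is needed:

# erase the misbehaving sendmail package, then pull a fresh copy
rpm -e sendmail
up2date sendmail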

Wow … it’s almost like rebooting Windows!

GetRight … segmented downloading like BitTorrent

I have been using GetRight for a
long time. It is still, IMHO, one of the most amazing download
managers ever written. It is to downloading what ICQ is
to IM … the ultimate download manager, with options and features
beyond what the average person could ever use.


Tonight, I was using its “mirrors” capabilities, and realized that it provided a “torrent”-like capability long before BitTorrent
was around. GetRight allows me to click a link in my browser, and
select for GetRight to handle the downloading. As the download
starts, I can then go and visit other mirror sites for the same file,
and click those links also. GetRight will automatically notice
that it is already downloading that file, and start a new connection to
the new source server … and split the download into “segments”.
In this example I am downloading four segments of the Fedora Core 3 CD
#4. I simply went to four different mirror servers and clicked
the link to download the same file from each one. GetRight
handled everything else!
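
As a rough illustration of what GetRight automates, here is a hand-rolled version of segmented downloading using curl byte ranges against two hypothetical mirrors:

# fetch two halves of the same file from two different mirrors in
# parallel, then stitch the segments back together
curl -r 0-349999999 -o part1 http://mirror-a.example.com/FC3-i386-disc4.iso &
curl -r 350000000- -o part2 http://mirror-b.example.com/FC3-i386-disc4.iso &
wait
cat part1 part2 > FC3-i386-disc4.iso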

It is intelligent software like this that probably contributed to
ideas like BitTorrent. In this case, I am able to leverage the
various mirrors that exist to increase my download bandwidth …
without requiring things to be in a BitTorrent format. It’s funny
that I have been doing this for quite a while, but failed to think
about the similarity to BitTorrent.