Apache and IIS on Windows Server 2003, Part 2

I made some progress, and learned more about installing both of these
applications on Windows Server 2003.  First, if you use the
command “netstat -an” you can see which ports are in use.  I already
knew about netstat; it was the “-an” option that revealed the core
information.
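
The entries to look for are the listeners bound to the wildcard address.  Filtered down to port 80, the output looks something like this (illustrative; your addresses will differ):

    C:\> netstat -an | find ":80"
      TCP    0.0.0.0:80             0.0.0.0:0              LISTENING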

That piece of information was that Apache and IIS each claim port
80 – in its entirety – across all IP addresses … no matter what the
settings in the various configuration files lead you to believe.
The documentation for Apache on Windows contains a note about this that is not completely clear:

Because Apache cannot share the same port with another TCP/IP
application, you may need to stop, uninstall or reconfigure certain other
services before running Apache. These conflicting services include other WWW
servers and some firewall implementations.

I would not have believed that this would be the case even on different IP addresses …
but it is.  So there is no way (that I can find) to
install both of these products, on different IP addresses, on the same
machine.  Done.

OK, so then what is the solution?  I have now found two ways to
resolve my situation.  First, I experimented with creating a
virtual server in IIS that simply redirects to Apache on a
different port.  So I installed Apache on 10.0.0.5:81, and
then configured IIS on 10.0.0.5:80 with a redirect to
10.0.0.5:81.  This worked!
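
For anyone repeating this, the Apache side is just the Listen directive in httpd.conf (the IP address and port here are from my setup), and the IIS side is the redirect option on the web site’s properties … the Home Directory tab lets you point the site at a URL instead of a directory:

    # httpd.conf -- keep Apache off of port 80 so that IIS can claim it
    Listen 10.0.0.5:81
    ServerName 10.0.0.5:81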

The second solution is even better for my purposes.  I was
actually installing all of this to get a “JAM” application … Java, Apache,
MySQL … up and running.  The actual configuration has Apache as
the web server, using mod_jk to connect to Tomcat, which is the Java container.  The Java application is what accesses MySQL through JDBC.
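
For reference, the Apache-to-Tomcat hookup boils down to a workers.properties file plus a few mod_jk directives in httpd.conf.  The worker name and the /myapp path below are placeholders (the AJP port 8009 is Tomcat’s default):

    # workers.properties -- tells mod_jk how to reach Tomcat over AJP
    worker.list=ajp13w
    worker.ajp13w.type=ajp13
    worker.ajp13w.host=localhost
    worker.ajp13w.port=8009

    # httpd.conf -- load mod_jk and hand /myapp requests to Tomcat
    LoadModule jk_module modules/mod_jk.so
    JkWorkersFile conf/workers.properties
    JkLogFile logs/mod_jk.log
    JkMount /myapp/* ajp13w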

As I was reading about the configuration of Tomcat I found that
there is an IIS version of mod_jk!  The mod_jk
isapi_redirector.dll will allow me to connect IIS directly to Tomcat
without requiring Apache … duh!  I should have figured that
someone would have written such a connector.
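
From what I have read so far, the IIS flavor re-uses the same workers.properties; the extra piece is a uriworkermap.properties file that tells the redirector which URLs IIS should forward, with the DLL itself added to IIS as an ISAPI filter.  Again, /myapp and the worker name are just the placeholders from the sketch above:

    # uriworkermap.properties -- URLs that IIS passes through to Tomcat
    /myapp/*=ajp13w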

I’ll post an update on Monday or Tuesday … I downloaded the
components, but ran out of time to get this new method installed and
working.

Installing qmail

I’m going to be writing a series of posts that detail my experiences in
installing a new mail server on Linux.  I have been running a mail
server called the Mercury Mail Transport System on Novell NetWare for a long time.  The NLM version of Mercury
has been robust and works … although it lacks some of the more
recent innovations in e-mail systems.  On top of that, I want to
get rid of my NetWare servers … they just aren’t what I want to be
running any more.

I have installed a server with Fedora Core 2, and as of this weekend I
finally dove in and began the actual installation of the mail
server.  After a lot of looking around, I chose qmail – “Second most popular MTA on the Internet” – and I also wanted to add the TMDA anti-spam solution.

I read through the qmail installation instructions and have to admit
that I was a little worried … until I found the “lazyinstaller for
qmail” at lazyinstaller.net.
This is one amazing script, and it made the entire process a
breeze.  Once I had the script on my machine, I simply edited a
few parameters to define my primary domain, some paths, and a few other
items.  (NOTE:  I noticed later that I could have used their
on-line generator to create my customized script ready to download!)

Once I had customized the lazyinstaller script, I ran it and was
impressed.  It downloaded all of the source tars, unpacked them,
built the projects, customized configuration files, and set up qmail
complete with SMTP, POP3, and IMAP (both SSL and non-SSL!) and web-based
administration tools.  There was only one error in the script that
I ran (v2.0.2) where a directory was not created for binqimap … I
created the directory and copied the contents of the config file from
the script into the new directory.  At the end of the install,
there was a short note on creating the start-up and shut-down scripts
… and I was ready to go.  I started up the services, and
everything has been running smoothly!

I have already started testing with some virtual domains, and
everything seems to be working fine.  As of tonight, I installed
TMDA, and have now started my testing with that.  I just completed
the first tests there, and it’s working great.

I have a total of ~15 mail domains with 40-50 users that I have to move
to this new server.  I’m looking forward to moving one of those
tomorrow … I’ll post more about my success!

Novell NetDrive … a dying product?

I have been using NetDrive
(http://support.novell.com/servlet/filedownload/uns/pub/ndrv41862.exe/)
for years now, and it is a very innovative piece of software. It
completely alters the way that people use FTP to transfer files …
making it as easy as “mapping a drive”. With NetDrive I can “map”
a drive letter, say “N:” to my FTP server on the Internet. I can
then “drag and drop” files just like any other drive on my system.

The real issue with FTP is that it is not the most secure protocol that
you can use. Most Linux and UNIX users are using SSH and SCP
instead. SSH is the “secure shell,” and combined with SCP, the
“secure copy,” it allows you to access your remote boxes through an
encrypted connection. I use both of these all day, and what hit
me was that the usability of SCP – even using WinSCP – is not equal to
that of NetDrive.
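
For anyone who has not used it, a basic SCP copy from the command line looks like this (the host and paths are made up):

    scp report.pdf user@server.example.com:/home/user/docs/
    scp -r user@server.example.com:/var/www/site ./site-backup

That is exactly the usability gap … with NetDrive the same operation is a drag and drop.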

I started to check and see if Novell had released a version of NetDrive
that would use the secure protocols, and found that I could not locate
any newer versions of NetDrive! The last one I can see is from 17
Apr 2003! And there is no apparent work on a version that
supports SSH/SCP …

Here they have a very powerful tool that could be used to “seed” the
market and alter how people access Linux from Windows … branded with
Novell’s name … and they seem to be letting it die.

Novell … drop it into Open Source … or update it! You are again allowing a valuable beachhead to disappear …

The power of MRTG …

The Multi Router Traffic Grapher (MRTG)
is an elegant piece of Open Source software. It is amazingly
simple, yet powerful … a great combination. I first became
aware of MRTG years ago when working on network management
software. The foundation for a lot of network management and
monitoring is the Simple Network Management Protocol (SNMP). MRTG
was designed to provide trend graphs of SNMP variables that were being
polled. Well, it actually started as a tool to graph some specific
variables – the statistics of data going in and out of a network
interface.
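
A basic MRTG target for interface statistics is only a few lines of configuration; the hostname, community string, and interface index here are placeholders:

    # mrtg.cfg -- graph in/out octets for interface 2 of a router via SNMP
    WorkDir: /var/www/mrtg
    Target[router-if2]: 2:public@router.example.com
    MaxBytes[router-if2]: 1250000
    Title[router-if2]: Router Interface 2 Traffic
    PageTop[router-if2]: <H1>Router Interface 2 Traffic</H1>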

What is great about MRTG is that it was then extended to go beyond its
roots … in a couple of different directions. The first
area that I really like is that I can add scripts to MRTG that return
values to be graphed … anything that I want. You can only graph
two variables per graph … but they can be any type of data.
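
The hook for this is MRTG’s external command syntax: put the script in backticks as the Target, and have the script print four lines – the first value, the second value, an uptime string, and a name. A hypothetical target looks like this:

    # mrtg.cfg -- graph whatever a script prints, instead of an SNMP variable
    Target[my-data]: `/usr/local/bin/my-script.pl`
    MaxBytes[my-data]: 100000
    Options[my-data]: gauge
    Title[my-data]: My Two Values
    PageTop[my-data]: <H1>My Two Values</H1>

The gauge option tells MRTG to graph the values as-is rather than treating them as ever-increasing counters.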

I have now written a variety of MRTG scripts to scrape web interfaces
for a variety of devices and applications. For example, I wrote an
MRTG script to scrape the status screen of my ActionTec GT701 DSL
modem. With this ActionTec MRTG script I can now see up-to-date trend graphs of the traffic going through my DSL modem.
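
I won’t reproduce the whole ActionTec script here, but the shape of these scrapers is pretty much always the same. The URL and the regular expressions below are stand-ins, since every device formats its status page differently:

    #!/usr/bin/perl
    # Hypothetical MRTG scraper: fetch a device status page with wget,
    # pull two counters out of the HTML, and print them in MRTG's format.
    use strict;
    use warnings;

    my $url  = 'http://10.0.0.1/status.html';   # placeholder device URL
    my $html = `wget -q -O - $url`;             # grab the status page

    # Placeholder patterns -- adjust to match the device's actual HTML.
    my ($rx_bytes) = $html =~ /Receive Bytes:\s*(\d+)/;
    my ($tx_bytes) = $html =~ /Transmit Bytes:\s*(\d+)/;
    $rx_bytes = 0 unless defined $rx_bytes;
    $tx_bytes = 0 unless defined $tx_bytes;

    # MRTG expects four lines: value one, value two, uptime, and a name.
    print "$rx_bytes\n";
    print "$tx_bytes\n";
    print "unknown\n";
    print "DSL modem\n";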

Another example is this NoCat MRTG script that I wrote for the NoCat
project – an Open Source network authentication application. It
also scrapes the web page generated by the NoCat Gateway
software. In both of these examples, I am able to extend the
functionality of MRTG using Perl and wget …

Now I’m also using MRTG as a primitive OLAP tool … to graph the
results of queries to a MySQL database. In the backend systems
that run our wireless network – 80211.net
– I am writing records to a SQL table to track our sales of Internet
Access. I’ve now written a quick Perl script that does a query of
the database, finds all of the records of sales this month, and then
calculates the revenue that has been generated … and outputs it in
the correct format for MRTG. And so now, I have several graphs
that show our month-to-date sales so that I can see our progress each
day … and throughout the day. What is interesting is to be able
to see the trends of when people purchase Internet Access …
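
The script itself is nothing fancy … the database, table, and column names below are placeholders rather than the real schema, but the whole thing is just a SUM() over this month’s rows printed in MRTG’s four-line format (I print the same number twice since there is only one value I care about):

    #!/usr/bin/perl
    # Hypothetical month-to-date revenue query for MRTG, using DBI/DBD::mysql.
    # Database, table, and column names are placeholders -- not the real schema.
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('DBI:mysql:database=sales;host=localhost',
                           'mrtg_user', 'secret', { RaiseError => 1 });

    my ($revenue) = $dbh->selectrow_array(
        "SELECT COALESCE(SUM(amount), 0) FROM purchases
         WHERE YEAR(sold_at) = YEAR(NOW()) AND MONTH(sold_at) = MONTH(NOW())");
    $dbh->disconnect;

    # MRTG's four-line format: two values, an uptime string, and a name.
    print "$revenue\n$revenue\n";
    print "unknown\n";
    print "Month-to-date sales\n";

On the MRTG side this uses the same backtick Target and gauge option shown above.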

MRTG allows me to easily visualize any type of information … in a very simple and elegant way.

When cameras are everywhere …
This is an amazing article, with a link to a web site that shows just how advanced criminals are becoming … and how they are leveraging technology.

The concept is simple as described below … what is wild is that they are using some fairly simple technologies to accomplish this. Just the other night I saw an episode of Law & Order where a high school student took pictures of other students in the gym locker room … with her cell phone … and then sent them to other people. I hadn’t even thought about the portability of these “wireless cameras”. This all makes me think about where we are heading when miniature cameras can be carried and left just about anywhere. And people are thinking that we can protect privacy?

ATM Skimmers with Wireless Cameras, Pickups. Automated Teller Machine customers now robbed wirelessly without knowledge: The University of Texas at Austin police have a compelling page that shows how a skimmer (which scans ATM cards before they’re inserted into the ATM) and a wireless camera in an innocuous position nearby can steal a card and the PIN. The skimmer reads the magnetic stripe; the camera can see the PIN being entered. The thieves park nearby and retrieve the information wirelessly. This is reminiscent of last month’s story of a wireless Israeli post office money heist. It may be just me, but after years of being warned about shoulder surfers in the 1980s and 1990s, I often cover my hand when entering a PIN on a phone or ATM. I guess my paranoia pays off. Also, I only go to one bank’s ATM machines, which are uniform. I think I’d notice a weird add-on…. [Wi-Fi Networking News]

Freenet still alive and kicking …
When I first read Ian’s papers about Freenet (quite a long time back) a group of us immediately set up nodes for testing and experimentation. It was very crude back then, and several months ago I even stumbled on one of my old NetWare servers that still had the directory structure and files. It was good to see this update and to see that Freenet is still making great progress. I just downloaded it to see about getting it up and going again. It appears to have come a long way …

Freenet Project More Stable, In Need [Slashdot]

The Operating System Monoculture dilemma
It is often fun to speculate and point at problems … the solutions, however, do not always come easy. This article is about the issues surrounding a paper written about the “Windows Monoculture” … proposing that so many people are running Microsoft Windows products that a single major flaw could be discovered that causes massive damage (to the entire human race?) when millions of computers are affected.

There are a number of “flaws” with this model, although it points at some potential issues to be learned from. One thing is that no real solution is outlined … and the “obvious” solution is that the world ought to be running on tens or hundreds of different operating systems to solve this dilemma.

Replacing one ‘monoculture’ with a different ‘monoculture’ is not a solution. So having GNU/Linux dominate the earth would simply spawn a new group of “anti-GNU/Linux” people who would call that wrong, and create their alternative. There are only two real ways out … to create something within the technologic substrate that is superior to what is possible in the biologic substrate … or to have a large and diverse number of operating systems.

I actually think that what we are going to find is that the technologic substrate will allow for the emergence of entities that far exceed the capabilities of the biological world that we are a part of.

Warning: Microsoft ‘Monoculture’. A security expert warns Microsoft’s dominance of software is a set-up for global disaster — and promptly loses his job. His comparison is to biology, where species with little genetic variation are vulnerable to catastrophic epidemics. [Wired News]

Autonomic tools from IBM … the coming abstraction
I found two articles recently that cover the release of the IBM tools for Autonomic computing. Even if you are not interested in IBM’s tools, there is a lot of very good reading about the core concepts.

The article below, and this NWFusion article both give a brief overview of what IBM released, and contain links to where you can download the tools or read more about them.

I do believe that they are introducing some powerful models for developing software that is able to exist in highly distributed networks and to deal with failures effectively. Much of this is accomplished using some very simple concepts.

Several of these are the same areas that we have been exploring with our web services work and our application substrate. I really like their Installation and Deployment model, as it mirrors much of our own functionality … there might be some aspects that we embrace. All of this continues to support a growing abstraction above the operating system.

IBM delivers autonomic tools. Big Blue packages up the results of its research into self-managing systems with an open-source toolkit that plugs into the Eclipse development set of software. [CNET News.com – Front Door]

More hope for less spam … soon …
There appears to be some good momentum in the anti-spam area, and this is a good first effort to combat the problem. There are no doubt other proposals and standards that will emerge.

This specific solution will force companies to define their mail servers in DNS in a way that allows them to be held accountable for spam. This will provide a way to refuse e-mail if the source of that mail cannot be tracked down. It’s a very good start.
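
If the final standard ends up looking like the current proposals, publishing that information would come down to adding a DNS TXT record for each sending domain … something along these lines (illustrative syntax only, since the details are still being worked out):

    example.com.   IN   TXT   "v=spf1 mx a:mail.example.com -all"

The record essentially says “mail from example.com legitimately comes only from these servers; treat anything else with suspicion,” which is what makes the accountability possible.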

eWEEK: New Anti-spam Initiative Gaining Traction. A grass-roots movement to improve the SMTP protocol that governs e-mail traffic is gaining acceptance, and its lead developer hopes to get fast-track approval by the Internet Engineering Task Force to make the emerging framework a standard. [Tomalak’s Realm]

More automated video security
I have always enjoyed working with video. There are numerous ways that it can be used for entertainment, and also for applications like security. This is a very impressive suite of applications for video security.

As PCs and their web-cams become more cost-effective, software suites like this can be used as extremely ‘intelligent’ solutions for monitoring a home or business. This software has the ability to detect motion on any of its cameras, and then begin recording and generating notifications. What was really impressive was that it even supports multiple zones to monitor within a single camera image. The screenshots give a more detailed explanation of the features.

As I get some time … I might give this a try. I have some ideas on what I can do with something like this …

ZoneMinder 1.17.2. A Web-based video camera security, motion capture, and analysis suite. [freshmeat.net]