Monday, December 11, 2006

Installing CentOS 4.4 under Virtual PC 2004

CentOS is a free version of RedHat Enterprise Linux which closely mirrors the commercial version. There are many places in the 'verse which detail changing the default color depth from 24-bit to 16-bit for the S3 Trio adapter that VPC emulates. After doing this I tried 'startx' only to find it wouldn't work so I started digging.

The X windowing environment started up, then flashed and shut down again with a console error of 'unable to find xterm'. Checking around, I found examples from one cranky coot (a.k.a. Jason) under the heading 'Get GUI'. Turns out CentOS uses a package manager called 'yum', and he lists the commands for installing X11. I got as far as the fonts before getting errors, so I figured I'd fire it up again and see what happened. I did get a desktop, but the resolution was way too big and unusable.

Jason also showed commands for installing XFCE, which is a lighter and faster desktop environment. So I tried that, typed in 'startxfce4' and voila! I then tried browsing the web only to get an error about Firefox not being installed. No problem - I opened a console window, tried 'yum install firefox' and it worked.
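
For anyone retracing my steps, the rough sequence on the CentOS 4.4 guest looked something like the sketch below. The group and package names are my best recollection of the stock yum metadata rather than Jason's exact commands, so check 'yum grouplist' if a name doesn't match:

    # run as root inside the CentOS guest
    yum grouplist                          # confirm the exact group names available
    yum groupinstall "X Window System"     # base X11 server, fonts and clients
    yum install xfce4-session xfce-utils   # XFCE packages (names vary by repo)
    startxfce4                             # start the lighter XFCE desktop
    yum install firefox                    # grab Firefox once the desktop is up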

Hope this helps an old Windows developer like me ;)

Original post

Friday, November 3, 2006

Changing an install directory dynamically in WiX 2.0

Somehow this topic seems so commonplace, and yet nowhere have I been able to find it spelled out clearly and simply. This reminds me of the joke about a helicopter pilot who's lost in the fog near Seattle, asks where he is, and is then able to quickly navigate from there. When asked how he managed that, he explained that the person in the window held up a sign saying "You're in a helicopter" - such a typical engineer's response (accurate and truthful, but not helpful or useful) that the pilot knew he was at the Microsoft campus and could navigate from there back to the airport.

If you need to set a directory "on the fly" either based upon a value from a dialog or some other means there are two things you need to do and one caveat:

  1. Create a custom action that sets the directory value:
    <CustomAction Id="AssignWebDir" Directory="WEBSITEDIR" Value="[WEBROOTDIR][SITE.NAME]"/>

  2. Schedule the action in the InstallExecuteSequence (it must come after the CostFinalize action):
    <Custom Action="AssignWebDir" After="CostFinalize"><![CDATA[NOT Installed AND (&WebServer = 3) AND NOT (!WebServer = 3)]]></Custom>

The caveat is that by the time your custom action runs, the directory name, which is also a property, has already been resolved to a full path. In the above example, assume that the parent directory, named WEBROOTDIR, is the default "C:\Inetpub\wwwroot\". Under that parent directory is where you've specified WEBSITEDIR to be. Even though you may specify an initial directory name (e.g. "mysite") on the <Directory> tag in the .wxs file, you must supply the full path when changing the name - thus the new value in #1 above contains the parent directory property and the site name (which I prompt the user for).

To round out the notes for the above example - the <![CDATA[...]]> syntax around the condition preserves the XML integrity because of the ampersand (&) character in front of the Feature name. The full condition reads: the product is not installed (i.e. it's being installed), the Feature (in this case the "WebServer" feature) is being installed locally, and it's not already installed locally. Finally, the Directory= attribute causes WiX to create a Type 35 Custom Action, whereas using Property= causes WiX to create a Type 51 Custom Action. I mention that because the distinction is only briefly noted, buried in the .chm help - not in the online docs and not in the tutorial (more on that in a moment). When you're searching and trying things, it's easy to miss that bit of info.
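
Putting those pieces together, here's a minimal sketch of how it all hangs together in one .wxs file. The element layout is WiX 2.0, but the directory nesting under TARGETDIR is trimmed and the Ids (WEBROOTDIR, WEBSITEDIR, SITE.NAME, WebServer) are just the hypothetical names from the example above:

    <!-- Sketch only: the surrounding TARGETDIR/Product plumbing is omitted -->
    <Directory Id="WEBROOTDIR" Name="wwwroot">
      <!-- Initial name is a placeholder; the custom action re-points it at install time -->
      <Directory Id="WEBSITEDIR" Name="mysite" />
    </Directory>

    <!-- Type 35 custom action: set WEBSITEDIR to parent path + user-supplied site name -->
    <CustomAction Id="AssignWebDir" Directory="WEBSITEDIR" Value="[WEBROOTDIR][SITE.NAME]" />

    <InstallExecuteSequence>
      <!-- Must run after CostFinalize, once directory paths have been resolved -->
      <Custom Action="AssignWebDir" After="CostFinalize">
        <![CDATA[NOT Installed AND (&WebServer = 3) AND NOT (!WebServer = 3)]]>
      </Custom>
    </InstallExecuteSequence>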

My "beef", and the reason the joke came to mind, is that you can find lots of bits and pieces of WiX and MSI info such as "You can use a Custom Action to set a directory location", but the person giving the answer stops short of mentioning the rest or giving an example. I'm not picking on that one person - this seems to be the style and culture of the mailing lists on SourceForge. That is, I can find dozens of responses on directories and properties with Nabble (which does a much better job of indexing, searching and presenting the SourceForge mailing lists), but nobody bothered to offer a concise and complete answer.

Even the WiX Tutorial, which I'm sure took a lot of effort and a long time to compile and update, mentions the "facts" in bits and pieces: section 3.2 shows examples of dynamically setting a directory using either a Directory= or a Property= attribute, but fails to mention the difference (Type 35 vs. Type 51) or its impact. I say both because you'll find both suggestions repeated on the mailing list without a complete example or a clear indication of which to use and why. The same 3.2 section shows how to schedule the actions, up above the examples, but doesn't mention that directory changes have to be scheduled after CostFinalize and require a fully-resolved path to be specified. It's only through more searching elsewhere on the mailing list that you can dig up the fact about CostFinalize and scheduling directory/file name changes after it - and even then, nobody spelled out the little tidbit about paths already being resolved at that point. Also, section 5.6 covers creating directories (which is where you'd expect to find the information for changing them) but neglects to mention that they can be changed, that you do so with a custom action, or to link back to the relevant 3.2 section showing that. Going beyond this, if you want to put together a dialog to prompt for and pass values, you'll have to spend a lot of time in Lessons 2 and 8 as well as the mailing lists. ;)

Finally, Rob teased us with his last entry on the directory table and Type 51 actions here, but at the rate he's been able to post, it'll be around the holidays before we get the answer. ;)

I'll end my rant with one more Microsoft observation - I searched Rob's blog for "msi directory". What's the very first thing displayed at the top of the list? Why, the number of results AND THE NUMBER OF FREAKIN' MILLISECONDS IT TOOK TO FIND THEM! Yeah, that's what I'm most interested in - NOT! Then there's the random order of the results list - not by date or title or even number of comments - who knows!

This is the tired helicopter pilot signing off from the fog around Seattle :)

Original post

Friday, October 27, 2006

Generate WiX 2.0 Web Folder Fragment

This little utility grew out of a need. I'm offering it up for free because it's so rough around the edges - no documentation, brute force approach, hardcoding, assumptions, etc. - you get what you paid for! It consists of a command script, a VBScript and an Xsl file - you can download it here. To use it, you also need another little utility I built, which generates a nested Xml directory listing; you can find it here.

It will take an Xml directory file (from this previous utility) and generate a WiX 2.0 compatible fragment with the following features:

  1. DirectoryRef element so it can easily be referenced back to a Directory in a "controlling" WiX file.
  2. Component element for the web virtual directory with WebVirtualDir and WebApplication child elements.
  3. Component element representing the root of the web folder, with child File elements for everything found there, including both long and short names as needed.
  4. Nested Directory elements representing each of the sub folders under the root web folder (recursive) with all files.
  5. Ignores Visual SourceSafe (*.scc) files. ;)
  6. ComponentGroup element to pull together all the pieces of the web site so it can be easily referenced from a Feature (via ComponentGroupRef) in a "controlling" WiX file.

Of course, there's been limited testing (very limited, ok...I've only tested it on one project) so you may find some bugs. There are also some hardcoded values and assumptions. Pay special attention near the top of the Xsl file where it builds the web virtual directory stuff (e.g. AllowSessions, DefaultWebSite, etc.). I didn't bother to get fancy and code for parameters.

For lack of time, it carries a big assumption about the various artifact names - it uses the root folder name that you're building from. For example, if your web folder source is located at [C:\Inetpub\wwwroot\mywebsite] then it'll pick up "mywebsite" as the internal WiX name for the vdir, webapp, alias, etc. Finally, it picks up the auto-generated GUIDs from the directory Xml ("ah, that's why he stuck them in there" ;) ) to use for component IDs and GUIDs. Therefore, this probably only has limited use as a first-time tool, since changing the GUIDs on each release of an MSI is generally frowned upon.
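
To give a feel for the shape of what gets generated, here's a trimmed, hypothetical fragment that mirrors the feature list above. Treat it as an illustration only - the Ids, GUIDs and some attribute details below are placeholders reconstructed from memory of the WiX 2.0 schema, not the script's literal output:

    <!-- Goes inside the usual <Wix> root element; Ids, GUIDs and attribute details are placeholders -->
    <Fragment>
      <DirectoryRef Id="WEBSITEDIR">
        <!-- the web virtual directory / web application component -->
        <Component Id="cmpMywebsiteVDir" Guid="PUT-GUID-HERE">
          <WebVirtualDir Id="vdirMywebsite" Alias="mywebsite" Directory="WEBSITEDIR" WebSite="DefaultWebSite">
            <WebApplication Id="appMywebsite" Name="mywebsite" />
          </WebVirtualDir>
        </Component>
        <!-- files sitting in the root of the web folder -->
        <Component Id="cmpMywebsiteRoot" Guid="PUT-GUID-HERE">
          <File Id="filDefaultAspx" Name="DEFAULT.ASP" LongName="default.aspx" DiskId="1" src="default.aspx" />
        </Component>
        <!-- one nested Directory per sub folder, recursively -->
        <Directory Id="dirImages" Name="images">
          <Component Id="cmpImages" Guid="PUT-GUID-HERE">
            <File Id="filLogoGif" Name="logo.gif" DiskId="1" src="images\logo.gif" />
          </Component>
        </Directory>
      </DirectoryRef>
      <!-- everything rolled up so a Feature can reference it with a single ComponentGroupRef -->
      <ComponentGroup Id="MywebsiteComponents">
        <ComponentRef Id="cmpMywebsiteVDir" />
        <ComponentRef Id="cmpMywebsiteRoot" />
        <ComponentRef Id="cmpImages" />
      </ComponentGroup>
    </Fragment>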

To use it, simply drop the three files into your WiX build directory and invoke it from a command line: bldWebWxs "path_to_root_of_webFolder_source" wxsFilename

Hopefully, this will save you some time and get you most of the way there.

Enjoy!

Original post

Build XML Directory Structure

I've hacked together a little VBScript utility that will generate an Xml representation of a directory structure which you can download here. I looked around for something that already did this and came across Pat Coleman's DIR2XML utility. It provided a basis for what I needed but there were a couple of things I had to tweak - it generated a "flat" listing of all files and it produced an XHTML document via an XSL transform. Pat did a lot of the heavy lifting with the recursive directory scanning and getting file version information.

My bldDirXml.vbs does the following:

  1. Generates a nested Xml containing <folder> and <file> tags.
  2. Supports a "flattened" mode [-f] which pulls the information up into attributes of the folder or file tag instead of child elements.
  3. Generates a unique id for each folder and file.

To run it, use the following command: cscript bldDirXml.vbs "folderPath" [outputFilename] [-f]
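
For illustration, the output (shown here in the flattened [-f] style, where the details live in attributes) has roughly this shape. The element and attribute names are placeholders to give the idea - the authoritative format is whatever bldDirXml.vbs actually writes:

    <folder id="d1" guid="{11111111-1111-1111-1111-111111111111}" name="mywebsite">
      <file id="f1" guid="{22222222-2222-2222-2222-222222222222}" name="default.aspx" />
      <folder id="d2" guid="{33333333-3333-3333-3333-333333333333}" name="images">
        <file id="f2" guid="{44444444-4444-4444-4444-444444444444}" name="logo.gif" />
      </folder>
    </folder>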

All the usual disclaimers apply - it's a pretty brute force approach, not heavily tested, your mileage may vary, hold your nose if you look at the code, etc.

Enjoy!

Original post

Friday, October 20, 2006

Windows Live Writer Beta

Back in August, Jeff Julian posted a detailed explanation of how to configure Live Writer for posting to GWB. I had to hunt to find it again so I could install it on Vista RC2, so I figured it was worth a "link" here.

Original post

DHTML Editing Control removed in IE7+

This hasn't seen a lot of circulation (ok, I didn't pay much attention to it) and I was just bitten by it, so I wanted to capture the links I found. The symptom is that when using Outlook Web Access against Exchange from Vista (Beta 2+), you can't compose a new mail message. The DHTML ActiveX control for rich text editing doesn't get downloaded or instantiated.

This was warned about here back in June by B. Ashok on the IE blog. In there he also states "In the near future, we will also killbit the Safe for Scripting control in IE7 in Windows XP so that it will not get instantiated from the browser".

There is a critical patch for Exchange mentioned and linked to in his (her?) blog post which will correct this issue.

Also, I found a white paper on MSDN that talks about this control and ways to work around it. The white paper explains that Vista no longer gets the control automatically, but that an MSI installer is being made available which will put it on Vista if you choose to install it manually.

Original post

Wednesday, October 18, 2006

WiX3 doesn't yet support COM+ installations

I found out while trying to build a COM+ installation that WiX3 doesn't yet support this. I was working on it originally with the latest WiX2 drop and ran into some difficulties (my own doing, it turns out). While troubleshooting, I thought I'd give the latest 3.0 drop a try just in case some bugs had been fixed.

After backing up my *.wxs files and running them through the new wixcop (using the -f switch to upgrade the schemas), I received compile errors around the pubca extensions. Looking around, I found there weren't any binaries for this in the drop. My next approach was to browse the CVS repositories, thinking maybe they just weren't included in the packaging yet. Nope.
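
For reference, the schema-upgrade pass is just wixcop with -f, something like the line below. The filenames are placeholders, and since -f fixes the sources in place, run it against copies (hence the backups):

    wixcop -f Product.wxs Components.wxs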

I jumped on the wix-devs mailing list and asked about the missing pubca stuff in WiX3. Kudos to Rob Mensching, who responded within 30 minutes with the answer...the pubca custom action stuff has yet to be merged into the WiX3 codebase. So...if you're doing COM+ stuff you'll have to stick with WiX2 for now.

Original post

Friday, October 13, 2006

Fall in NH

Dave Burke (a.k.a "davebu") has Murray and Julie has her dogs, but down here in NH I've got Fritz and a flock of wild turkeys.

Original post

Sunday, September 24, 2006

Successful upgrade of DotNetNuke (DNN) v3 to v4

This weekend I upgraded our hosted church site to the current release of DotNetNuke v4.3.5. Here's a quick summary of the steps I took and some observations.

Steps to upgrade:

  1. Unzip the install package to a local directory.
  2. Copy release.config to web.config in the install root directory (local machine).
  3. Update the web.config: replace the machine key with ours; comment out the SQL Express connection strings and uncomment the standard SQL Server strings; put in the correct connection string settings for the hosted site; uncomment the trust level and change Medium to Full*; remove the databaseOwner prefix (a sketch of these edits follows below).
  4. Backup hosted site to local machine using ftp client.
  5. Backup hosted database.
  6. Remove all files from root directory of hosted site.
  7. Remove all subdirectories except /Portals/0 which contains our site stuff.
  8. Ftp the site from local machine to hosted site.
  9. Switch the hosted site from ASP.NET 1.1 to 2.0
  10. Hit the hosted site for the first time - triggering an upgrade.

* Learned this after the fact - modules will not install otherwise, since they're compiled on the fly and the zips are removed from the Install/Modules subdirectory. Full trust is needed to give the ASP.NET application permission to read/load/delete the module zips.
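
For step 3, the relevant pieces of web.config end up looking roughly like this. It's a sketch only - the server name, database name and credentials are placeholders, and the exact element layout and connection string name vary by DNN release, so diff against your old file rather than copying this verbatim:

    <connectionStrings>
      <!-- SQL Express entry commented out; standard SQL Server entry uncommented and edited -->
      <add name="SiteSqlServer"
           connectionString="Server=myHostedDbServer;Database=myDnnDatabase;uid=myUser;pwd=myPassword;"
           providerName="System.Data.SqlClient" />
    </connectionStrings>

    <system.web>
      <!-- uncommented and changed from Medium to Full so module zips can be installed/compiled -->
      <trust level="Full" originUrl=".*" />
      <!-- machineKey copied over from our old web.config (values elided) -->
    </system.web>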

One thing I noticed concerns User Defined Tables. I have several with a date column in them. In v4, UDTs have been enhanced to support Date, Time, and Timestamp types. For whatever reason, the decision was made to upgrade an old Date datatype to a Timestamp, and the time portion defaults to 12:00:00. Why they chose to do this is beyond me. If all we had was a date before, why not keep the datatype as Date? Perhaps it's because the old Date datatype was in actuality a full timestamp. For me, it meant going in to edit the UDT settings for dozens of tables to switch from Timestamp to Date.

Original post

Thursday, September 21, 2006

DotNetNuke (DNN) upgrade and the infamous Could not load type 'DotNetNuke.Common.Global'

I manage a DNN site which is currently running on DNN 3.1.1, hosted at WebHost4Life. I have a local backup of the site on my home server running Win2003 and SQL 2005. I tried to upgrade the local copy to the current release, 4.3.5, using the xxx_Install.zip, and on the first try I received the infamous "Could not load type 'DotNetNuke.Common.Global'" error. I'm calling it infamous because there are a lot of howls posted over at the DNN forums site with very few real suggestions for getting around the problem.

To test the upgrade, I backed up my copy of web.config, used the development.config as a starting point, and put the correct SQL provider settings in. I used a diff tool to copy directly from the xxx_Install.zip to the website folder - replacing everything, removing what wasn't there any more and skipping the *.vb source files. When I first hit the site, I received the "could not load type" error. After hunting around on the web and finding little or no helpful information (such as this thread), I began trying to diagnose it myself.

Judging by the initial quick error, I figured it was something very basic with the configuration. However, I noticed in the event log details on the server that it seemed to be getting past default.aspx and was down in /install/install.aspx when it died. Hmm...remembering that all the modules are in zips now and get automagically deployed, I tried something whimsical - deleting all the zips (maybe it was trying to load these?) and voila - it got further along.

At this point, I tried a couple of things in quick succession (it was getting late last night). First, I uncommented the medium trust setting - the problem with non-colorizing editors such as Notepad is that comment markers are easy to miss. Second, I got a different error about a missing *.vb file. I recalled some discussions over on ScottGu's blog about the compilation model being different under 2.0, so on a whim I went back to the xxx_Install.zip and copied over all the *.vb files. Note, this is my "pseudo-production" server - I purposely did not install VS.NET to avoid the usual "it works on my development machine" issues. After that, the site came up minus the custom modules and the skins.

It's a start at least and I think I've stumbled onto something regarding what is going on during an upgrade. Unfortunately, I don't really have the time or luxury of building/rebuilding this to test/diagnose the issue further. I'm a big fan of cookbook-style guides so maybe I'll get to one soon. I hope to finish my testing/eval and get the site pushed out to the hosting company by the weekend. Also, this just shows me that I've got to spend some dedicated time working with 2.0, heck 3.0 is already starting to pop up!

Original post

Wednesday, September 20, 2006

DB2 UDB 7.1, JDBC, and NULLID.SYSSH200

This is an obscure problem which took a lot of digging. It was finally Roger Bjärevall's post here that led to the answer. Roger is the developer of Minq Software's DB-Visualizer, which is a great utility for connecting to a variety of databases, as I often do.

The bottom line is you can't use Type 4 drivers to connect to DB2 7.1, so that route won't work. You have to configure the connection using the driver installed with DB2 (\SQLLIB\java\db2java.zip). I got that far pretty easily. However, you need to specify the 'com.ibm.db2.jdbc.net.DB2Driver' class for the connection, which isn't the first one in the list of drivers. It was Roger's casual post that made it come together and finally work! Thanks for the tip and the great little product that's reasonably priced (remember when TOAD was cheap, before Quest jacked the price up through the roof??).
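
If you're connecting from your own code rather than DB-Visualizer, the same legacy driver works through plain JDBC. Here's a minimal sketch, assuming db2java.zip is on the classpath and the DB2 JDBC applet/net server is listening on its usual port; note that the driver class capitalization differs between releases (some db2java.zip builds register it as COM.ibm.db2.jdbc.net.DB2Driver), so check the jar you actually have:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class Db2NetConnect {
        public static void main(String[] args) throws Exception {
            // Legacy "net" driver shipped in \SQLLIB\java\db2java.zip;
            // the newer Type 4 driver can't talk to DB2 UDB 7.1.
            Class.forName("COM.ibm.db2.jdbc.net.DB2Driver");

            // Host, port, database and credentials are placeholders for your environment.
            String url = "jdbc:db2://dbhost:6789/MYDB";
            Connection conn = DriverManager.getConnection(url, "db2user", "secret");
            System.out.println("Connected to " + conn.getMetaData().getDatabaseProductVersion());
            conn.close();
        }
    }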

Original post

Friday, September 15, 2006

Antec P160

Features:

  • 1.2mm anodized aluminum delivers increased rigidity and a finish that won't lose its luster.
  • Swiveling front control panel
    - Swivels up to 45 degrees.
    - Connectors: 2 x USB 2.0, 1 x IEEE 1394 (FireWire, i.Link) and 2 x audio jacks.
    - LED temperature display with two built-in sensors
  • Windowed side panel
  • Removable motherboard tray.
  • Accommodates any ATX12V power supply
  • 10 drive bays
    - Color-coordinated CD-ROM & floppy drive covers
    a. 4 external 5.25"
    b. 2 external 3.5"
    c. 4 internal 3.5"
    - Rubber mounting grommets in hard drive trays
  • Cooling capacity:
    - 1 x 120mm low speed fan
    - 1 x 120mm fan mount
  • Fan Specs:
    - RPM: 1600
    - CFM: 56.13
    - dB(A): 28
  • Built-in washable air filter
  • Removable Side Panel
  • Fits motherboards up to Standard ATX

DIY Workstation / Gaming Rig

Last winter I decided to upgrade my 6-yr-old Dell 733 MHz, figuring it was long overdue. What got me started was a Christmas wish list over on ExtremeTech. Here are the specs:

It turned into a long afternoon/evening to get everything installed and ready but I was pleased that it booted up perfectly the first time I turned it on...

Original post

Thursday, August 31, 2006

Vista Pre-RC1 (build 5536)

Since my Beta 2 was acting very flaky, I decided to format the partition and try a newer build. It installed cleanly and came up with no problems. One issue I discovered is that avast! antivirus (free for home use) does not work on build 5536. After installing and rebooting, you'll get a message that Windows Vista has disabled a faulty driver. Hunting around on avast!'s forums, I found someone claiming MS made a mistake in "banning" avast!'s driver and that it would be corrected for the RC1 release coming soon.

Original post

Monday, July 10, 2006

DotNetNuke (DNN) Error "Invalid attempt to read when no data is present"

When upgrading from 4.3.1 to 4.3.2 I encountered the above error. A few posts in the DNN Forums suggested there were problems with the portals defined. In my case, I had one portal defined with an id of zero (0), yet there was also a portal with id one (1) called “My Website” and most other columns null. I started the 4.x journey with 4.0.2, and checking there I found only one row in Portals. So....perhaps the upgrade to 4.3.1 did something? Anyway, I had just tried my cookbook approach to upgrading, posted about here, so I simply deleted the database, recopied it, fixed the aliases, then deleted the spurious row. When I first hit the site, it ran through the upgrade perfectly; then I clicked the link at the bottom of the upgrade report page and was “in like Flynn”.

Original post

How to upgrade a DotNetNuke (DNN) site

I've got a pretty simple “cookbook” way to upgrade (or test upgrade) a DNN site, which I thought I'd share. I'm using 4.x, which means SQL Server 2005 and VS.NET 2005.

  1. Unzip the download to a new directory
  2. Make the new directory a virtual web share
  3. Copy/paste either development.config or release.config and rename to web.config
  4. Compare the old web.config to the new web.config (using a tool like WinMerge) and copy over the database connection string (edit to use new database name from step #5) and machine key.
  5. Copy the database - while only SQL Server needs to be running to use the site, to take advantage of the new copy database feature you must have SQL Server Agent and SQL Server Integration Services running.
  6. Open the PortalAlias table and fix up the URLs to match the new directory setup in step #2 (see the sketch after this list).
  7. Fire up a browser and hit the new URL - upgrade should kick off automatically and leave you with a link to the home page.
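
For step 6, you can edit the rows in Management Studio or run a quick query along these lines. The table and column names are the standard DNN ones, but the alias values are placeholders and your databaseOwner/objectQualifier prefixes may differ:

    -- point the copied portal at the new virtual directory from step #2
    UPDATE dbo.PortalAlias
    SET HTTPAlias = 'localhost/dnn435'
    WHERE HTTPAlias = 'localhost/dnn431'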

Now go cut the grass or whatever chores you've been putting off.

Original post

Friday, June 23, 2006

How to hack MDAC 2.8 off of a machine

I can't take credit for this as I found the tip here. Basically, the preferred method is to grab the MDAC installer for the version you want *removed* and use a couple of lesser-known setup switches:

  1. use the /T switch to specify a temporary folder to extract the setup files into
  2. use the /C switch to extract the files only, without running the install

Also, it's good to use the MDAC Component Checker to verify which version you have installed. Then grab the standalone installer for that version, unpack it, run the uninstall, reboot and you're good to go.

I downloaded the mdac_typ.exe and used a command prompt to execute the following:

mdac_typ /t:C:\mdac /c

I then switched to the C:\mdac directory just created and executed the following to uninstall:

dasetup /u

I happened to be running this on a Windows 2000 machine, and after the reboot Windows File Protection kicked in and tried to restore stuff. I cancelled it, then tried Component Checker again and promptly got an error because MSXML wasn't installed. I then ran the install for the version of MDAC I wanted (2.7 SP1 Refresh). After it finished, I tried Component Checker again and it told me the version of MDAC installed was indeed 2.7 SP1 Refresh.

As with any undocumented hacks like this, your mileage may vary.

Original post

Monday, June 19, 2006

Vista Beta 2 not a good host for Virtual PC?

I posted here how easy it was to get Virtual PC running on Vista. I figured it was a great way to try out Vista (as my office/email/messaging desktop) while still being productive with VMs containing specific development environments. However, after about a week of use, I've decided I'm better off booting into XP to run VMs. I've been experiencing “lockups” of a minute or more which make effective work impossible. For example, I'll be clicking down through directories on the C:\ drive and suddenly Explorer hangs for a while. Sometimes I'll see the network activity icon going, other times not. If anybody has tips to share, please do!!

My host machine is a Dell Inspiron 6000 (Pentium M 760, 2.0 GHz) with 2.0 GB of RAM. The VMs are running from a USB 2.0 attached 5600 RPM drive - I know, 7200 would be better. The images are all VPC 2004 SP1 with up-to-date VM additions. The kicker is that the same images run well when I boot into XP and run them there. With everything identical except the host OS, I can only “blame” Vista Beta 2 as the culprit.

Original post

Wednesday, June 14, 2006

Omea Reader under Vista

Omea Reader 2.1.1 (free version) installs and runs well under Vista Beta 2. I've discovered one quirk (design flaw?) though. I've set up a separate Data partition to make things easier dual-booting between XP and Vista. So after installing Omea under Vista, I hunted around for the settings on the XP install, found them under C:\Documents and Settings\{username}\Local Settings\Application Data\JetBrains\Omea and copied them over to the Data partition (e.g. F:\Omea). When I launch Omea under Vista and go to Tools|Options|Omea|Paths and set the database and log files path to F:\Omea it only accepts the new value for the Log Files. The database setting reverts to the standard Vista location E:\Users\{username}\AppData\Local\JetBrains\Omea. Seems like a bug or perhaps a design flaw?

For the record, after exiting Omea, copying the settings over to E:\Users\{username}\AppData\Local\JetBrains\Omea (replacing what's there), then restarting, you'll find all your feeds, subscriptions, clippings, etc. are there, which was nice. Now the problem is remembering which OS I can run Omea in so I don't lose any data!

I wonder if there's a registry or file hack where the value is stored for the database path? Hmmm...

UPDATE: I copied the contents from the \Users location in Vista over to my data partition (F:\Omea), updated the registry values (under HKCU\Software\JetBrains\Omea) to F:\Omea, moved the folders in the \Users location to the Recycle Bin, then restarted Omea. Success!
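
If you want to see exactly which values hold the paths before editing anything, a quick dump of that key works (the key name is from the post; the value names you'll see are whatever Omea writes, so I won't guess at them here):

    reg query "HKCU\Software\JetBrains\Omea" /s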

Original post

Tuesday, June 13, 2006

Great non-destructive partitioning utility - and it's free!

After testing Vista Beta 2 running inside Virtual PC and then on an older machine, I decided the next step was to take the plunge and set up my main laptop for dual boot. At that point, it “hovered” on my mental to-do list until I saw this article on Lifehacker. What really caught my eye was Gina's mention of a comment posted regarding GParted - an open source partition manager. After reading the article here, I headed over to the project workspace and downloaded it. Bottom line - it works as advertised and is a great addition to the toolchest. The ISO is only about 30 MB and contains a bootable version of Linux. After paying for Partition Magic v3, v4, v5, v6, and v7...I'm done shelling out money for upgrades!

Original post

Virtual PC running on Vista host

I looked for an answer but few people seem to have posted about this one, so I figured I would in case others asked. After trying Vista *inside* a virtual machine, I next wondered if I could install Virtual PC onto a Vista host. After all, running Vista “right on the metal” of a real machine is much sexier than inside an emulated VM :). After a false start trying to launch the MSI directly, I found a suggestion somewhere that you had to run the Setup.exe stub instead, which worked great. So...when trying older installers, use the Setup.exe and it seems to work well. BTW, Virtual PC 2004 SP1 works well under Vista, so migrating over is much easier. You *ARE* using VMs for development, testing, etc., right??? Doing so makes the physical machine irrelevant and recovery from hardware failures much faster.

Original post

Windows Vista Beta 2 and missing msvcr71.dll

While trying out various things on Beta 2 (Build 5384) I ran into a missing msvcr71.dll while trying to use Password Safe. Since I was dual booting with XP on another partition, I copied the file over from \Windows\System32 on the XP partition. My super-scientific methodology was to try launching again. Next it was msvcp70.dll, and then finally msvcp71.dll. After that the application came up fine. Soooo...if you're missing any of these three, grab a copy of them from an XP installation and simply drop them into \Windows\System32 on the Vista partition. Seems to work okay ;)
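
From an elevated command prompt on the Vista side, the whole fix boils down to a few copies along these lines (the drive letters are just an example of a dual-boot layout - substitute wherever your XP and Vista partitions actually live):

    rem XP on D:, Vista on C: in this example
    copy D:\Windows\System32\msvcr71.dll C:\Windows\System32\
    copy D:\Windows\System32\msvcp70.dll C:\Windows\System32\
    copy D:\Windows\System32\msvcp71.dll C:\Windows\System32\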

Original post

Tuesday, March 28, 2006

Dynamic DNS

As I previously mentioned, I've got fiber running into our home now. One of the issues I discovered was that, unlike Comcast, the IP address changes frequently and the DNS name is built dynamically from the IP address. If you need to get to your machine remotely, then a "dynamic DNS" service is one way to go. There are several out there which offer free services - I chose DynDNS. Here are a few things I discovered...

After signing up for an account, you can create a dynamic DNS hostname using one of their 25+ domain names such as blogdns.org, dyndns.org, homeip.net, or even kicks-ass.net. With this DNS entry you supply the IP address it points to, so "mycoolname.homeip.net" will route to the IP address of your home computer.

There are free utilities which can automatically update your DynDNS account with a new address. DynDNS Updater is one I found which works well. You install it on your home computer and leave it running (either in the system tray or installed as a Windows service). It will periodically check your external IP address and, if it's changed, it'll update DynDNS. I later stumbled on a casual mention in a forum that D-Link routers have the ability to update a dynamic DNS built right in. After logging into my router, sure enough I found it under Tools / DDNS. I figured it was the best way to go, as it would be running on my router, would know when the IP had changed, and would save me from installing and running one more thing at home.

However, DynDNS has a pretty good FAQ which brought a couple of things to light. First, the only "recommended" router is a Linksys model. Second, they feel the software method is better because you can configure various settings and the logging capabilities are much better. I've experienced this first hand: one of the times my IP changed, the D-Link DDNS update didn't work and, just as DynDNS said, I had no idea why. After that, I turned it off on the D-Link and re-enabled the software, which has been running fine since. I keep a little server going at home so it wasn't a big deal.

To avoid problems with having to run a web server on ports other than 80, you can use another free DynDNS service called "WebHop". With that, you specify a name such as web.mycoolname.homeip.net and redirect it to a URL such as http://mycoolname.homeip.net:8080. That way you can bookmark and give out a "standard name" without having to remember the port, and it'll get redirected to the correct port at your dynamic domain name. This also makes it easier if you embed links, as you can use this aliased name and only maintain the actual URL in one place.

Original post

FTTP, baby! (Verizon FIOS)

We're finally on fiber at my house, na-na! Dan Bricklin has a long article with lots of photos on his blog, so here are some random points I've discovered since the install.

  • Earlier in the week the fiber line was brought down from the pole to the outside of the building in preparation.
  • The install really does take 4+ hours.
  • The final connection into the "ONT" was made from the feed left earlier using a portable splicer that melts/fuses the glass fibers together
  • The glass really is about the size of a human hair
  • The meter reported 18 Mbps download at the ONT
  • Verizon offers 5 Mbps down / 2 Mbps up for $34.95 and 15 Mbps down / 2 Mbps up for $44.95 (both with a one-year agreement; a phone call to upgrade)
  • They also offer a 30 Mbps down / 5 Mbps up tier
  • Unlike high-speed cable, Verizon installs a battery backup in case of power outage that's rated for 4 hours (tech said he's heard of customers getting 8 hrs of life from them)
  • They pull the old copper wires from the house during the install
  • I read in a forum somewhere that all the old regulations regarding access and required sharing of lines with other providers only pertain to the copper lines, not the new fiber ;)
  • They give you a D-Link DI-624 wireless G router which also has four ethernet ports
  • Vonage's Linksys router is now "hanging off" (behind) the D-Link and it only required a power off/on to pick up a new address from the D-Link
  • D-Link's default internal IP is 192.168.0.x whereas Linksys (w/ Vonage) was 192.168.15.x
  • D-Link's router really is true port forwarding - you can, for example, map port 8080 outside to 80 inside (Linksys forwards a port to an IP only - same port)
  • Verizon blocks port 80 (port forwarding works around this)
  • Fiber uses PPPoE like DSL does
  • In the span of 4 days I've had 4 different external IP addresses assigned
  • External DNS names are dynamic, containing the IP address (e.g. pool-x-x-x-x.subdomain.fios.verizon.net)
  • Discovered free dynamic DNS service (dyndns.org) to overcome name/ip changes

A big SORRY to Dave Burke - I'm sure it'll be a while before fiber finds its way up into Vermont.

Original post