This is My Forty

I’m writing this entry late in the evening of my fortieth birthday. Tonight I sit here taking a few quiet moments to let myself recharge after a number of very social days. How much importance a single birthday deserves just because it’s a round number like forty can be debated, but I’ve been asked for my thoughts on this birthday a few times over the last few weeks, so I thought a moment to reflect wouldn’t be out of line.

Having a birthday less than two weeks before Christmas means that I’m used to it being a busy time. There is the normal activity as the days count down to the holiday season and the end of the year. Having worked in higher education for the last thirteen years it’s also a time that marks the end of a semester, student exams, and graduation ceremonies at work.

Even with those factors, I’ve been busy lately. Earlier this year brought some challenging times, but as summer turned to autumn I’ve found myself both fairly busy and mostly happy. I don’t think those are unrelated. I’ve opened myself back up more socially after closing off for a while. Meeting people without expectations has left me busy, at times even hectic, but enjoying life as it comes.

As I reach forty I find myself happy. Not all of my life is perfect, but is life ever? It is pretty good. I’m in the best health I’ve been in since I was a teenager and the best shape of my life. When I look forward it’s with optimism and the feeling that my best years are still ahead of me rather than behind me. And that, I think, is truly all I could ask for on any day.

Port Forwarding in Windows

I always enjoy finding something new that meets a need. As you might guess from the title, I recently found myself needing to forward a port on a Windows server. The scenario is that I had a server I needed to allow access to from a network that hadn’t originally been planned for. I could have just opened a firewall port, but I prefer to set up a more secure method.

For web connections this can be done pretty easily using a reverse proxy, and Windows Server 2012 even includes a wizard to make it easier to set up. In this case I needed to forward an arbitrary port to a port on another server.

It turns out this functionality is built into Windows and has been since at least 2008, and the command to do so is pretty simple.

netsh interface portproxy add v4tov4 listenport=80 listenaddress=10.0.0.1 connectport=8088 connectaddress=192.168.1.75

This binds port 80 on address 10.0.0.1 on the local server and forwards any traffic received on that port to port 8088 at address 192.168.1.75. The response is also returned back through the proxy to the source. It works quite nicely in early testing and fills a need I’ve always had trouble finding a good, reliable solution for on Windows. There are only a few limitations I’ve found so far. From my reading it seems to require IPv6 to be installed to work, even if you’re not making an IPv6 connection. It also cannot bind the localhost addresses, which limits its use in development scenarios. Documentation on the command is at http://technet.microsoft.com/en-us/library/cc731068%28v=ws.10%29.aspx.
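For reference, the same portproxy context can also list the active rules and remove one when it’s no longer needed. These are standard netsh subcommands; the values below match the example above:

netsh interface portproxy show all
netsh interface portproxy delete v4tov4 listenport=80 listenaddress=10.0.0.1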

As implied by the v4tov4 portion of the command, you can also use this to set up proxies between IPv4 and IPv6 servers. That should come in handy as migration to the new IP version happens over the next few years.
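As an example, a v4tov6 rule looks much the same as the one above; the IPv6 address here is just a made-up placeholder:

netsh interface portproxy add v4tov6 listenport=80 listenaddress=10.0.0.1 connectport=8088 connectaddress=fd00::75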

Article Published on Tuts+ Code

My article on Securely Handling User’s Login Credentials is up on Tuts+ Code.

For most websites, you have different areas within it (home page, user profile, admin page, etc.), some of which will be public and others will need to be restricted to only certain users. You often want to uniquely identify users so you can provide customized content or to capture specific information from a user. Many sites also need to protect part of the site, such as an administrative area to maintain and update the content of the site. In a CMS site, some users may be able to create content, but others must approve that content before it is shown to the public.

Read the Rest.

Better Weigh in the App Store

So earlier this year I decided to write an app for the iPhone. In my case I wanted to lose a little weight before working to add on some muscle for a planned summer trip. I’d not been particularly happy with anything I found to track my weight before, so I decided to write my own. Thus was born Better Weigh.

[Better Weigh screenshot]

The app focuses on helping you track your weight and spot trends, such as subtle weight gain, before weeks of dieting are required to lose the unwanted weight. It works whether you’re looking to lose weight, gain weight, or just maintain your weight.

Just normal changes from diet, exercise, and other activities can cause your weight to vary by several pounds per day. These daily swings make the real changes in your weight over time hard to track. Better Weigh smooths out these variations, showing you how your weight is really changing and helping you reach your goal.

You can enter your weight manually or sync with FitBit. More syncing options are planned. You can find it on the App Store or see more info at http://betterweigh.me.

Cisco ASDM gives “Missing required Permissions manifest attribute in main jar” Error

Java recently pushed out a new update that implemented a change that had been warned about for a while. By default, it no longer allows running apps that are unsigned, self-signed, or missing permission attributes.

This includes the Cisco ASDM manager. The quick fix (other than Cisco adding the missing permission attributes) is to add the web address where ASDM is accessed as an exception. You can do this in the Java settings, reached in Windows through the Java option under Control Panel.

There, select the Security tab and click the Edit Site List… button next to the Exception Site List. Here, add the URLs of the firewalls that you access with ASDM. After doing this, ASDM will connect and work normally again.
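The entries are simply the addresses you point your browser at to launch ASDM, something along these lines (both of these are made-up examples):

https://firewall.example.com
https://10.0.0.254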

Interestingly enough, given Java’s sudden concern about my security, it still asks to install the Ask Toolbar with every update….

More info on the change at http://www.java.com/en/download/help/java_blocked.xml

Goodbye AppStorm

I saw today that AppStorm is shutting down. I hate to see that. I am admittedly biased, as I wrote for the Mac and Windows sides of the site through late last year. I always felt the site had good reviews, and I found a number of useful tools and apps through it over the last few years. More worrying is that another quality site goes down for losing money while the “Ten Surprising Ways Your PC Can Hurt Your Cat” sites keep going.

Encoding CSR on Exchange 2010

I’m mostly writing this for my own benefit, as I have to do this a couple of times a year and always have to look it up. By default, if you generate a new or renewed certificate signing request on Exchange 2010, it comes out as a binary file that almost no certificate authority accepts. They want a base 64 encoded file instead.

It’s easy to convert the binary file to base 64, though, using the certutil utility with the -encode option.

certutil -encode C:\renewal.req C:\renewal.csr

This command encodes the binary file renewal.req into a base 64 encoded file renewal.csr that will work with any certificate authority.
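If you want to sanity check the result before submitting it, certutil can also decode and display the encoded request; this is just an optional quick check:

certutil -dump C:\renewal.csr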

Fixing a Lightroom Catalog

The biggest risk of any type of database file is corruption. Sometimes you can fix it, but too often the only way to recover from a corrupted file is to restore a backup from before the problem showed up and rebuild or recreate anything lost. For full database servers there are ways to minimize these problems, but for personal catalogs, not so much. Major corruption lets you know, often with a corrupt file message when starting the program. When the corruption is subtle, you may not know about it until it’s too late to easily recover.

I do a lot of photography and organize my work using Adobe Lightroom. At heart, the Lightroom catalog is a specialized database storing information about the photos and the data you’ve attached to them. I found myself seeing an odd problem: whenever I would take an image into Photoshop for editing, the edited photo would not show up in Lightroom as it should. After searching the Internet and talking with Adobe support, I confirmed the catalog was the problem.

I now faced the prospect of either creating a new catalog, importing my photos, and then rebuilding the lost data, or rolling back to a several-month-old backup and redoing every import and edit since then. The latter might not have been a bad option except that I’d done quite a few of both in the previous few weeks. Either way I’d have to hope I didn’t miss anything. Neither felt like a particularly good option.

I began to look for ways to pull the data I couldn’t normally save, such as pick/reject flags, from one catalog to another. I knew there was an SDK for creating plugins and tools to work with Lightroom, and I began to think about writing something to export everything I cared about into something like an XML or CSV file and then import it again.

I had no luck finding an existing app or plugin to do this, but during my search I learned that the catalog file is in fact a database, in a fairly common format known as SQLite. This gave me hope that I could extract the data I wanted using database queries. All those years writing web apps looked to be about to pay off in getting my data out of the corrupted catalog.
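As a side note, once you have the sqlite3 command line tool described below, a couple of quick checks like these will confirm the catalog really is a SQLite database and show whether SQLite itself reports corruption (a sketch; the catalog name matches mine and will differ for you):

./sqlite3 ~/Lightroom-3-Catalog.lrcat ".tables"
./sqlite3 ~/Lightroom-3-Catalog.lrcat "PRAGMA integrity_check;"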

I found two articles on the web at http://gerhardstrasse.wordpress.com/2010/08/19/recover-from-a-corrupt-adobe-lightroom-catalog-file/ and http://www.simplyness.com/more-photography-tips/recover-corrupted-unrepairable-lightroom-3-catalog-with-sqlite.html. Neither of these articles worked perfectly for me, but they did point me in the right direction.

Without diving too deep into the technical details, SQL databases are a fairly common type of database, and SQL is the language used natively to create and manipulate them. The process described involved converting the database into a text file containing a series of SQL commands that could then be used to recreate the database.

First I downloaded the command line tool for working with SQLite databases from https://www.sqlite.org/download.html. I downloaded and unzipped the shell binary for Mac OS, resulting in a program that can be run from the command line to manage a SQLite database. I moved the sqlite3 binary to my home folder along with a copy of my catalog file, leaving the original safely put away in case this didn’t work. I then used the following to dump the contents of the database into a text file containing the SQL commands needed to create that database:

echo .dump | ./sqlite3 ~/Lightroom-3-Catalog.lrcat > ~/Lightroom-catalog.sql

The vertical bar (|) breaks this command into two parts. The first part takes the characters .dump and sends them as input to the second part. The effect is the same as typing that command after the second part of the line runs. The rest of the command executes the sqlite3 binary I downloaded, giving it my catalog file as the database (and yes, I’ve been using this catalog since Lightroom 3). The .dump command tells SQLite to output the text commands it would take to create the database. At the end, the greater-than sign tells my computer to send that text to a file named Lightroom-catalog.sql instead of displaying it on the screen.
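Before loading the dump back in, it’s also worth a quick look at the end of the file. One commonly reported symptom of a damaged catalog is a dump that ends with a ROLLBACK line instead of COMMIT, which something like this will show (using the file name from the command above):

tail -n 5 ~/Lightroom-catalog.sql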

I now had a huge text file instead of an unreadable catalog file. Some articles I read noted common errors seen in the SQL commands, but my scan of the data found nothing out of order. Now that I had the text file, I wanted to create a new database from it using this command:

cat Lightroom-catalog.sql | ./sqlite3 ~/Lightroom-Catalog-Repaired.lrcat

This command is again split into two parts. The cat command takes the contents of the Lightroom-catalog.sql file we just created and would normally send them to the screen. As before, though, the vertical bar instead sends them as input to the command that follows the pipe. That command creates a new database with the name Lightroom-Catalog-Repaired.lrcat. In effect, the entire contents of the 600+ MB text file are automatically typed in.

I moved the new catalog file back to my Lightroom folder and opened it. Behold, everything was there and all looked good. The only problem I ran into was that when I next imported photos into Lightroom, it saw the parent of the folder holding the new files as different from the original folder despite them being the same. It made no sense to me, but it was easily fixed by clicking on each subfolder and using the locate folder option to get everything synced up.

It’s been a bit over a month now and all is still working well. Hope that helps anyone else running into this problem.

Managing Time Machine Backups to Windows Server Continued

Back in June I posted my experience getting Time Machine backups to work with a Windows Server as my main storage. It worked well, but had three problems. First, I’d often get an error as it tried to connect to my server when I was away from my home network. Second, I had to manually mount the drive before the backups would run. Third, I sometimes found issues if I put the computer to sleep in the middle of a backup and woke it up off the home network.

To fix these issues I came up with a few scripts. The first pair is designed to run when I connect to my home network and consists of an AppleScript and a shell script.

I’d started these over a year ago, but never perfected them and lived with the limitations. I’d worked on them off and on, but didn’t complete them until I had to recreate my backup system around the time of that last post. The code for starting and stopping Time Machine came from http://apple.stackexchange.com/questions/11177/quicksilver-accessible-script-for-disabling-and-enabling-time-machine and the other code came from places now lost.

First, the AppleScript to start Time Machine backups:

   1:  tell application "System Preferences" to activate
   2:  tell application "System Events"
   3:      tell process "System Preferences"
   4:              click menu item "Time Machine" of menu "View" of menu bar 1
   5:              tell button "ON" of window 1 to click
   6:      end tell
   7:  end tell
   8:  tell application "System Preferences" to quit
   9:   
  10:  tell application "Finder"
  11:      mount volume "cifs://<SERVER>/<SHARE>"
  12:  end tell

Lines 1 – 8 simply start Time Machine by, in effect, going into preferences and turning it on. Lines 10 – 12 then mount the server share containing the disk image.

Next I need to mount the disk image. This is trickier than it might seem in my case, as I had encrypted the image. I didn’t want to have to type the password each time since I’d already saved it in my Keychain. So I used the security command to pull the password for the disk image, named MacBook-Backup.sparsebundle here; this would need to change to the name of your disk image if you run the script. I then pipe the output of that command to the mount command to mount the disk image. This in effect types in the password read from the Keychain at the prompt when I mount the drive. The path (here /Volumes/TimeMachine/MacBook-Backup.sparsebundle) would also need to be changed if you use this script.

security find-generic-password -w -D "disk image password" -l "MacBook-Backup.sparsebundle" | hdiutil attach /Volumes/TimeMachine/MacBook-Backup.sparsebundle

Finally, a script to turn off Time Machine.

   1:  tell application "System Preferences" to activate
   2:  tell application "System Events"
   3:      tell process "System Preferences"
   4:              click menu item "Time Machine" of menu "View" of menu bar 1
   5:              tell button "OFF" of window 1 to click
   6:      end tell
   7:  end tell
   8:  tell application "System Preferences" to quit

There are several ways to use these. I initially ran them manually when needed. For automation, the best method I found was to use Keyboard Maestro’s ability to run scripts when a wireless network is connected to or disconnected from. I ran the first two scripts when I connected to my home wireless network and the last when I disconnected from it. I used that process for about a week and a half and found it worked very well.
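As an aside, recent versions of OS X also include a tmutil command that can toggle backups from the command line (enabling and disabling require administrator rights), which could stand in for the AppleScript portions above if you prefer the terminal; a minimal sketch:

sudo tmutil disable
sudo tmutil enable
tmutil startbackup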

Since then, though, I’ve moved from the Windows Home Server to a new Synology NAS. The new NAS supports native Time Machine backups over AFP, so I no longer need the disk image process I detailed here. The old setup worked for me for well over a year with only one problem, and the scripts worked for about a month, so I’d feel comfortable going back to them if the need arises.