Fixing POST errors that seem almost random

Recently my web host installed mod_security on my server. It seemed like a non-event until I started getting calls from a few clients who had been working merrily for a long time. When they tried to update content in either OpenCart or WordPress, they were getting strange errors such as “501 Not Implemented” or “406 Not Acceptable”.

The fix is to sidestep mod_security for those requests. Ideally the exception should be limited to the offending requests, but in the interest of getting people running again, I simply added a .htaccess file to the /admin folder of the broken sites with the following contents.

[code]
SecFilterEngine Off
SecFilterScanPOST Off
[/code]
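
Note that SecFilterEngine and SecFilterScanPOST are mod_security 1.x directives. If your host has moved to ModSecurity 2.x, the names are different, and 2.x doesn’t normally allow its directives in .htaccess at all, so a variation like this may need to be set by the host in the server config:

[code]
# ModSecurity 2.x equivalent (usually not permitted in .htaccess)
<IfModule mod_security2.c>
SecRuleEngine Off
SecRequestBodyAccess Off
</IfModule>
[/code]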

Domain Registration – on my schedule!

Working as a freelance web developer means that I keep really strange hours. I may not get to a project until my kids are in bed, then keep working late into the evening. When there’s something that needs to be done, nothing ticks me off quite like a broken website that requires a phone call to customer support. Once or twice I’ll let someone get away with it; after all, sometimes stuff breaks. That’s just the way it is. But when it came time to renew my website domain, the face of my freelancing business, and I discovered yet again that the domain renewal pages of my registrar *still* would not allow me to renew online, I nearly lost it. I called up their customer support, where the person on the other end asked me for my username and password so that they could enter the transaction for me. Yeah right. It was time to find myself a new registrar.

After the usual poking around the Internet and some discussion among my associates, I settled on http://www.namespro.ca/. They’ve been in the business since 2003 and are a Canadian company, so I feel like I can trust them to manage my domains properly. I signed up and was able to quickly and easily move my domain over. The website is easy to follow and the domain management pages are clean and easy to use. I sent a couple of questions to their tech support and received prompt replies that made me very happy with my choice. I’ve since moved a couple more domains to them and will continue to change my domains over to namespro as they come due. Managing my domains from a proper web interface, on my own schedule, has never been simpler!

Resize and Rotate

Last week my laptop died, so when the new one arrived I started re-installing all those apps that make it work the way I want. This post is about one of those *must-have* applications. I manage all my camera images on my laptop, so being able to quickly resize or rotate images is pretty important to me. Luckily a friend showed me an easy way to do bulk resize and rotate with a simple right-click.

Installation

To add the new options to the Nautilus context menu we’ll first need to install the package:

[code] sudo apt-get install nautilus-image-converter [/code]

You’ll need to restart Nautilus, or simply log out and back in, before you get the new right-click menu options. After you’ve done that you’ll be able to right-click on any image file and see the two new menu items: “Resize Images” and “Rotate Images”.
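
If you’d rather not log out, a quick way to restart Nautilus from a terminal is to quit it and let the session restart it automatically:

[code]
# Quit Nautilus; GNOME restarts it on demand
nautilus -q
[/code]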

When you choose one of the menu items you will be presented with a simple interface to provide the relevant input parameters. What could be easier?!

Using cron jobs from cPanel

I’ve set this up before but couldn’t remember just how I did it. Instead of having to figure it out all over again next time, I thought I’d write it down somewhere.

Setting the time in cPanel’s standard cron UI is very simple, so I won’t bother with that, but the command for triggering a web page has a few things that messed me up.

For those who don’t care to read much, here’s the command:

[code] wget -O - -q -t 1 'http://www.mydomain.net/doit.php?id=12345' >/dev/null 2>&1 [/code]

Or, if you’re using a .htaccess username and password, like this:

[code]
wget -O - -q -t 1 'http://username:password@www.mydomain.net/doit.php?id=12345' >/dev/null 2>&1
[/code]

Of course you’ll need to insert your own URL, but otherwise you can cut and paste the line above and start running your PHP script as a cPanel cron job. But what does it all mean?

“wget” is a Linux command-line utility that fetches the contents of a web page. In our case we don’t care about the output of the page, only that it gets triggered (or ‘looked’ at).

“-O -” tells wget to write the downloaded page to standard output instead of saving it to a file. Combined with the redirect at the end of the line, the page contents are simply thrown away; we’re not interested in keeping them!

“-q” puts wget into quiet mode; I don’t want to know about errors here either.

“-t 1” tells wget to try only once; if the page doesn’t respond for some reason, give up.

The URL should be wrapped in single quotes! This is an easy one to miss because it’s not required in the simplest case, but if your URL has query parameters it’s a must. Otherwise the shell treats the “&” as a background operator, the URL gets truncated, and the request won’t work as you planned.
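
To illustrate, with a made-up URL:

[code]
# Unquoted: the shell backgrounds wget at the '&', everything after it is
# lost, and 'id=12345' runs as a separate shell assignment
wget -O - -q http://www.mydomain.net/doit.php?task=go&id=12345

# Quoted: the full query string reaches the server intact
wget -O - -q 'http://www.mydomain.net/doit.php?task=go&id=12345'
[/code]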

“>/dev/null 2>&1” tells cron *not* to send out a notification that the script was called. The details of this part get a bit hairy, but I’ll do my best to explain. As the script I’m triggering already sends an email when it completes, I don’t really want the one that cron sends automatically each time the job runs. As well, the subject of that email is the complete cron command. I don’t know about you, but if your scheduled cron command includes anything even sort-of private, I’d prefer not to blast it out in email every day! Consider: if you included a URL to a script behind an htaccess password, the username and password would be right there in the subject of your email every day!

The first part, “>/dev/null” redirects the output (standard out) of the command to /dev/null (the Linux equivalent of a black hole!). The second part, “2>&1” redirects ‘standard error’ to ‘standard out’. The short story is that there is no more output from your command and therefore there’s no email to be sent out!
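
For reference, the finished crontab entry might look like this. The schedule here, every hour on the hour, is just an example; cPanel fills in the timing fields for you:

[code]
0 * * * * wget -O - -q -t 1 'http://www.mydomain.net/doit.php?id=12345' >/dev/null 2>&1
[/code]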

Recovering files deleted from an SVN repository

This morning I find myself in need of a file that used to exist in an SVN project. The file was deleted several months ago and I don’t have a copy anywhere else. Finding it in my SVN repository turned out to be quite simple.

Run the following command to dump the complete activity log from the project.
[code]
> svn log --verbose > myLog.txt
[/code]

Then simply open the ‘myLog.txt’ file with your favourite editor and hunt for the last mention of the file you need. I’m looking for “register.module.php”, so for me that looks like this:
[code]
------------------------------------------------------------------------
r428 | mvoorberg | 2010-11-15 22:06:33 -0700 (Mon, 15 Nov 2010) | 2 lines
Changed paths:
M /faq.php
M /login.action.php
D /register.module.php
[/code]

Use the following command to restore the file to the local filesystem, at which point you can copy it out or do whatever you like with it. Notice that the revision number (427) is one less than the revision that included the Delete action above.
[code]
> svn up -r 427 register.module.php
[/code]

Once I copied the file out, I removed it from my working copy again by simply updating back to HEAD:
[code]
> svn up
[/code]

Don’t forget to remove the ‘myLog.txt’ file when you’re done with it.
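
As an aside, if you only need the file’s contents rather than a working-copy checkout, svn cat can pull an old revision straight into a new file. A sketch using the revision and filename from the example above, assuming a client recent enough to understand repository-root-relative URLs (^/):

[code]
# Write the r427 copy of the deleted file somewhere outside the working copy
> svn cat ^/register.module.php@427 > /tmp/register.module.php
[/code]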

Moving Joomla site to a subfolder

Before rebuilding this website, I wanted to make sure I kept the old site in a subfolder, both to preserve my search ranking and to avoid breaking any external links to my site. This is clearly a job for mod_rewrite, so I had a look around and googled things like “Joomla to subdirectory” and the like. Funnily enough I couldn’t find any posts about it, and no one had posted the required .htaccess entries to make it happen.

As I’d never used SEF URLs with my Joomla site and I wasn’t going to use Joomla for the new website, it was easy to find a pattern that all my old URLs would match: I could simply send anything whose query string started with “option=com” to my “archived” subfolder.

Here are the entries I added to the .htaccess in the root of my website:
[code]
# Make sure this module is turned on
RewriteEngine on

# Match any Joomla URLs (query string starts with option=com)
RewriteCond %{QUERY_STRING} ^option=com(.*)$

# Only forward GET and HEAD requests to the archived site
RewriteCond %{THE_REQUEST} ^(GET|HEAD)

# Don't match URLs that are already under the archived subfolder
RewriteCond %{REQUEST_URI} !^/archived/

# Send a 301 (Moved Permanently) to the archived folder
# and don't process any more rules (L)
RewriteRule ^index\.php$ /archived/index.php [L,R=301]
[/code]
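
A quick way to confirm the redirect is doing what you expect is to ask curl for just the response headers and check for the 301 and the Location header. The Joomla URL here is hypothetical:

[code]
curl -I 'http://www.mydomain.net/index.php?option=com_content&view=article&id=1'
[/code]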

Follow-up:
As it turns out, you can’t set database permissions for specific tables when running in a typical shared LAMP environment; I was hoping for SELECT only, with INSERT on some Joomla session tables. I noticed that I was still getting new spam user registrations on the old site, so I’ve disabled POST requests with the following entries, added to the .htaccess in the /archived folder of the old Joomla site:
[code]
# Make sure this module is turned on
RewriteEngine on

# Disallow POST requests to the archived site by returning a 404 error
RewriteCond %{THE_REQUEST} !^(GET|HEAD)
RewriteRule ^ - [R=404]
[/code]
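
Again, curl makes a handy test; a POST to the archived site should now come back as a 404 (URL made up):

[code]
curl -s -o /dev/null -w '%{http_code}\n' -X POST 'http://www.mydomain.net/archived/index.php'
[/code]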

Converting RAW images to Jpeg on Ubuntu

You’ll need to install a few packages before this works:
[code]
sudo apt-get install netpbm
sudo apt-get install dcraw
sudo apt-get install libimage-exiftool-perl
[/code]
Once you’ve got these installed, put the following in a file in ~/.gnome2/nautilus-scripts/ and restart Nautilus to add it to the right-click context menu.

[code]
#!/bin/bash
files=$#
count=1
message="Converting $files RAW files to Jpeg"

(while [ $# -gt 0 ]; do

    # Get the file extension (including the ".") in upper case
    upperExt=`echo "$1" | sed 's/.*\(\..*\)/\1/' | tr '[a-z]' '[A-Z]'`

    if [ -f "$1" ]
    then
        # Get the file name without the extension
        trimmed=`echo "$1" | sed 's/\(.*\)\..*/\1/'`

        if [ "$upperExt" = ".PPM" -o "$upperExt" = ".CR2" -o "$upperExt" = ".NEF" -o "$upperExt" = ".OTHER_RAW_EXTENSION" ]
        then
            if [ "$upperExt" = ".PPM" ]
            then
                # Convert the PPM image to a Jpeg
                pnmtojpeg "$1" > "$trimmed.jpg"
            else
                # Convert the RAW image to a Jpeg
                dcraw -c -w -h -b 1.0 "$1" | pnmtojpeg > "$trimmed.jpg"
            fi
            # Copy EXIF data to the new Jpeg image
            exiftool -overwrite_original -TagsFromFile "$1" "$trimmed.jpg" >/dev/null

            # Set the Jpeg's file timestamp to match the EXIF date
            dcraw -z "$trimmed.jpg"
        fi
    fi
    # Output the percentage done for the zenity progress bar
    echo `expr $count \* 100 / $files`
    count=`expr $count + 1`
    shift
done) | zenity --progress --auto-close --auto-kill --text "$message"
[/code]

You’ll have to make the file executable by using:
[code]
chmod +x raw2jpeg.sh
[/code]
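
Although it’s meant to be run from the Nautilus right-click menu, the script just takes filenames as arguments, so you can also try it from a terminal. The sample filenames here are made up:

[code]
cd ~/Pictures
~/.gnome2/nautilus-scripts/raw2jpeg.sh IMG_0001.CR2 IMG_0002.NEF
[/code]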

Update:
I modified the script to convert PPM files created by UFRaw as well. It also now uses pnmtojpeg instead of ppmtojpeg.

Mounting NFS shares on my home network

I use NetworkManager to connect to my wireless network and wanted a way to mount the drives automagically when I connect. However, since I connect to a number of wireless access points, I didn’t want to broadcast my credentials looking for my home server every time I connect at the coffee shop.

Enter the NetworkManager dispatcher. Scripts in /etc/NetworkManager/dispatcher.d/ are executed when a network connection is made. I created the following file as “99mountmydrives” and gave it execute permissions.
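
Something like this, assuming the same filename and the standard dispatcher directory:

[code]
sudo chmod +x /etc/NetworkManager/dispatcher.d/99mountmydrives
[/code]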

See the script below:
[code]
#!/bin/sh

IF=$1
STATUS=$2
USER=myusername
SERVERIP=192.168.1.200
SERVERMAC=00:aa:bb:cc:dd:ee

echo "running 99mountmydrives as `whoami`;" >> /tmp/dispatcher-log
echo "int=$1 and status=$2" >> /tmp/dispatcher-log

# Block until the named process shows up in the process table
wait_for_process() {
    PNAME=$1
    PID=`/usr/bin/pgrep $PNAME`
    while [ -z "$PID" ]; do
        echo "waiting." >> /tmp/dispatcher-log
        sleep 3;
        PID=`/usr/bin/pgrep $PNAME`
    done
}

mount_drives() {
    echo "waiting for nm-applet" >> /tmp/dispatcher-log
    wait_for_process nm-applet
    echo "mounting drives" >> /tmp/dispatcher-log
    /home/mvoorberg/automountmyth.sh
}

if [ "$IF" = "wlan0" ] && [ "$STATUS" = "up" ]; then
    # WIFI subnet at home

    # Ping the server, then confirm its MAC address is really my server
    test=`ping -c 3 $SERVERIP >/dev/null;arp -a $SERVERIP`
    echo "$test" >> /tmp/dispatcher-log

    if expr "$test" : ".*$SERVERMAC" 2> /dev/null ; then
        mount_drives
        exit $?
    fi
fi
[/code]
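
To check that the dispatcher is actually firing, reconnect to your wireless network and have a look at the log file the script writes:

[code]
tail -n 20 /tmp/dispatcher-log
[/code]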