Archive for August 27th, 2012

nginx testing

As part of my web optimization interest, I’ve been looking into alternatives & add-ons for the traditional LAMP stack.

The most obvious change I’ve found is replacing Apache with Lighttpd or nginx. nginx seems to be the more actively developed of the two (based on what I remember of a cursory web search a while ago), so I’m focusing my efforts on that.

I’m also looking at PHP-FPM instead of proxying PHP requests to Apache. And Varnish, which Brian has told me good things about. And a PHP accelerator or cache on top of that too; APC or memcached, something like that.
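
For reference, a minimal sketch of what the nginx-to-PHP-FPM handoff might look like. The server name, document root, and socket path are placeholders rather than a setup I’ve actually tested yet:

```nginx
# Sketch only: nginx handing PHP requests to a local PHP-FPM socket.
# Paths and names below are assumptions, not a working config.
server {
    listen 80;
    server_name example.com;
    root /var/www/html;
    index index.php index.html;

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php-fpm.sock;
    }
}
```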

I’m planning on using an Amazon EC2 instance to play around with nginx and everything else… get some use out of my credit while I have it.

Read the rest of this entry »

Stripping image metadata in Python

Currently, as part of my file deduper, I’m opening images, copying the image data to a new file, and saving that file. This is done using the Python Imaging Library, or PIL; specifically, I’m using an actively updated fork of it called Pillow.
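
That copy-the-pixels approach looks roughly like this (a minimal sketch; the function name and paths are just for illustration, not lifted from the deduper itself):

```python
from PIL import Image

def copy_without_metadata(src_path, dst_path):
    # Open the original and pull out only the pixel data.
    original = Image.open(src_path)
    # Write the pixels into a brand-new image, so EXIF/IPTC/XMP blocks
    # from the original never make it into the copy.
    stripped = Image.new(original.mode, original.size)
    stripped.putdata(list(original.getdata()))
    stripped.save(dst_path)
```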

However, I’ve since discovered pyexiv2, which says it allows the “access of … XMP metadata”. That means it might allow XMP data access in .DNG files; the exiv2 docs say this is possible, at least.

This interests me because DNG was something I tried before going back to .CR2, simply because .CR2 forced Lightroom to keep the metadata and image data in separate files. I do still have a bunch of .DNG files from that period that aren’t working with my deduper because the embedded metadata changed enough to alter the files, so this module would probably be a good addition to my growing toolkit.

My chosen method of copying and wiping metadata should work the same way it did with mp3 files. It’s just a matter of finding the pyexiv2 command for deleting image metadata, something exiv2 states is possible but that I can’t seem to find in the pyexiv2 docs.
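
If tag deletion is exposed the way the rest of the pyexiv2 API works, I’d guess it looks something like the following. To be clear, the per-key del is an assumption on my part based on what exiv2 itself can do; I haven’t confirmed it in the pyexiv2 docs:

```python
import pyexiv2

def wipe_metadata(path):
    meta = pyexiv2.ImageMetadata(path)
    meta.read()
    # Assumption: pyexiv2 lets you delete individual tags by key.
    # exiv2 supports deletion; I haven't verified the Python binding does.
    for key in meta.exif_keys + meta.iptc_keys + meta.xmp_keys:
        del meta[key]
    meta.write()
```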

Rebuilding a partition table after failed resize

So. When I decommissioned my NAS, I moved stuff off the drives incrementally. The RAID5 media partitions were the first to get nuked, and I reformatted that partition on one of the drives as NTFS, for use on my desktop. I then moved my pictures off the RAID6 onto that NTFS partition and wiped the RAID6 partitions.

I ended up with 500GB of unallocated space, followed by a 1.36TB NTFS partition with my pictures on it. Wanting to make use of that 500GB, I tried an in-place extension of the partition into the free space. Windows Disk Management didn’t allow it, so I tried GParted, and GParted worked: I now had a 1.86TB partition. I unplugged the drive and packed up for Canada, without testing it.

Now I’m in Canada, and have discovered one crucial thing: GParted didn’t move or copy the NTFS data structures backwards, so the partition is unrecognizable. (Why it allowed this is a question I haven’t tried to answer, but might in the future.) I haven’t reformatted the drive, so I should be able to get everything back if I rewrite the partition table to its original state.

Problem is (when is anything ever simple?) I don’t have a record of the partition table. I know I set it up as 500GB in the first partition and the remaining space in the second when I first installed Fedora, but I don’t have the exact partition boundary. Thankfully, because I have a backup (with Crashplan, who I’m still annoyed with), I won’t have any data loss. But I’d prefer not to redownload 700+GB of pictures, so onward with mucking around with partition tables instead!

First off, data recovery is not what I’m looking for. So PhotoRec, which I remember using with good results, is out. (As is PC Inspector File Recovery.) What I’m trying to find is either a partition recovery program, which would automatically recover everything, or a partition identifier program, which I could then use in conjunction with manually editing the partition table (yay parted, or alternatively, hex editors!).

Googling “Partition recovery software” gave me a few programs, most of which I ignored because they appeared to be the general commercial file recovery programs that scan your drive and attempt to detect files based on their headers.

I did find TestDisk (completely free), Active Partition Recovery (the DOS version is freeware; for Windows you have to pay), and Find & Mount (crippled freeware – speed is limited to 512KB/sec).

So I started off with TestDisk and ran that for a while. Then I interrupted it and restarted it, thinking I had passed the wrong options to it. I let it run for a while longer, but it still didn’t find anything, so I interrupted it to shut down everything and fly to Halifax.

Which is where I am right now, without the drive, which is still in Waterloo. But I can plan a course of action – something which is probably better than trying stuff when I have the drive in front of me.

So my plan is to try TestDisk again and get it to run its deeper search to try to discover the start of the NTFS partition. I’m going to see if I can find an option to start searching from the 499GB mark onwards, since I’m quite sure the boundary’s at the 500GB mark. (There’s a step-by-step guide on the TestDisk wiki, but no mention of command line options.) If I can’t, then I’ll let TestDisk run overnight and see if that finds anything.

If that fails, I’m going to look at Active Partition Recovery and see if the free version will show me the partition boundary so I can manually create the partition again.

If that also fails, then I’ll run Gpart, and see if that does anything.

Of course, the final thing to do would be to use parted to create a 500GB partition, and then a partition that uses up the remaining space. And if that doesn’t work, run the Fedora installer again and try recreating the partitions, being very careful not to select the ‘format’ option.
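
If it does come to the parted route, a rough sketch of what I’d run is below. The 500GB boundary and the device name are exactly the things I don’t have a record of, so treat the numbers as placeholders until TestDisk (or the Fedora installer) confirms them. The one saving grace is that mkpart only writes the table entry and doesn’t format anything, so the NTFS data should still be there if the boundaries line up.

```
parted /dev/sdX
(parted) unit GB
(parted) print                           # record the current (broken) layout first
(parted) rm 1                            # drop the over-extended 1.86TB entry
(parted) mkpart primary ext4 0% 500GB    # the old Fedora-era first partition
(parted) mkpart primary ntfs 500GB 100%  # the picture partition
(parted) quit
```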

I also found a few programs that I’ll try to remember if/when it comes to other hard drive stuff:

Read the rest of this entry »

Cheap hosting

I’ve perpetually been on the lookout for cheap hosting. Price is the reason I’ve been hosting my stuff with Dreamhost for the past few years (going on 6 years!), and is the reason I’m going to be taking advantage of MaxCDN’s ‘free’ 1TB of CDN bandwidth this week. (The reasoning being that I don’t have much use for it now but might in a bit, so I want to stretch that time out as much as possible; I’ll be signing up on Aug 31.)

It’s also the reason I’ve spent hours on LowEndBox after discovering it while looking for information about hosting a TF2 server. And oh my god, I’m so so tempted to sign up for some plans now even though I have no concrete uses for them at this point.

I can think of a few – hosting my own site with Varnish + nginx + PHP-FPM + memcached/a PHP accelerator, something that I can’t do with Dreamhost. Or using one as a VPN endpoint so I can actually use Google Music, which has sideloaded the music on my phone onto it but isn’t actually letting me access it (among other US-restricted services).

I’m comparing all this to Amazon EC2’s free tier, which would be essentially a free VPS for a year (it covers a month’s worth of micro-sized instance usage each month), and I can’t decide which is better. Amazon is obviously free, but the micro size isn’t acceptable for TF2. And as a VPN endpoint, the 30GB of included bandwidth is pretty low and the per-GB fees are high: at 10 cents a GB, I’d only need to push a bit more than 42GB (about $4.20) before I’d have paid more than the cheapest plan I found costs per month. If I had stats on my usage, it’d be an easy decision. (Since I’ve already got an account with Amazon, I’m not eligible for the free tier, but an alternate email address gets around that, unless they start restricting it to one account per credit card number.)

Plans plans plans. And decisions. Anyway, a few things that caught my eye for future reference:

Read the rest of this entry »
