Monthly Archives: January 2014

Focal Length

Going bird hunting this weekend, so I got myself a big gun: the Canon 70-200mm f/2.8.

I also got a 2x teleconverter to get as close to the birds as I can.

So how close is close?

All photos were taken on a Canon 7D, which uses a 1.6x crop sensor, so every shot has 1.6x the apparent reach it would have on a full frame camera.

I snapped some shots at different focal lengths to see. My target for these sample photos was a construction crane, specifically the driver's control box. Here are the results (click for larger image).

Construction Crane @ 15mm
Construction Crane @ 50mm
Construction Crane @ 50mm
Construction Crane @ 70mm
Construction Crane @ 200mm
Construction Crane @ 200mm
Construction Crane @ 360mm

I’ve always thought 200 x 2 would be 400, but apparently not. With the 2x teleconverter on the 70-200mm lens, the EXIF data reports the final shot was actually taken at a focal length of 360mm.
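Doing the math (and assuming the EXIF value reflects the true combined focal length), the "2x" behaves more like a 1.8x on this lens, and the 7D's 1.6x crop pushes the effective field of view even further:

<?php
// Quick sanity check on the focal length numbers.
$exif = 360;  // focal length reported by EXIF with the 2x attached
$lens = 200;  // bare lens at the long end
$crop = 1.6;  // Canon 7D crop factor

echo $exif / $lens;  // 1.8 -> the "2x" is effectively a 1.8x here
echo $exif * $crop;  // 576 -> full-frame equivalent field of view in mm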

Making Cloud Hosting Work

I love the concept of cloud hosting, but the numbers just don’t work. A few years ago I did a calculation of CPU power vs. Amazon compute units.

The bottom line was that if you needed a server online 24/7, it was just not cost effective. I know Amazon has a lot of clients and plenty of people make it work, but I can't get the numbers to add up.

I tried running the numbers again today. I've got a pair of email servers that consumed about 74 GB of bandwidth in the last 24 hours. Amazon now offers free inbound bandwidth, so let's price only the outbound traffic at 35 GB per day, or 1050 GB per month.

For bandwidth alone, Amazon would charge $124.20 a month.
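Here is roughly how that bandwidth bill shakes out, assuming a flat outbound rate of $0.12 per GB (the first pricing tier at the time); the exact bill varies a little with Amazon's tiers and the free first GB:

<?php
// Rough outbound bandwidth math -- the $0.12/GB rate is my assumption.
$outboundPerDay = 35;                      // GB out per day
$ratePerGb      = 0.12;                    // assumed first-tier $/GB
echo $outboundPerDay * 30 * $ratePerGb;    // 126 -- in the ballpark of the $124.20 quote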

That does not include the TWO servers that I would need to pay for. The servers are SuperMicro i5-based machines; I'm not exactly sure where their performance lines up against Amazon's cloud instances, but I will estimate conservatively at the c3.large class, which has 3.75 GB of RAM and 7 ECUs. Let's check the numbers under a long-term contract with 24/7 usage.

The two Windows-based servers would cost $270.68 per month.
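Working backwards from that quote, it implies an effective rate of roughly $0.185 per instance-hour for a reserved Windows c3.large; a sketch of that arithmetic:

<?php
// Back-of-the-envelope: what hourly rate does $270.68/month for two servers imply?
$monthly = 270.68;   // quote for two reserved Windows c3.large instances
$hours   = 730;      // approximate hours in a month
echo $monthly / 2 / $hours;   // ~0.185 -> effective $/hour per instance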

How about adding DNS to the list? I'm including a snapshot from one of my DNS servers (I have 4 total): it has been online for 32 days and has processed 115,937,886 requests, or 3,623,058 requests per day.

The DNS service (Route 53) would cost $58.50 per month.
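For reference, here is roughly how the query charges add up, assuming Route 53's standard-query rate of $0.50 per million at the time (hosted-zone fees come on top of this):

<?php
// Rough Route 53 query-volume math -- the $0.50/million rate is my assumption.
$perDay   = 3623058;                 // queries per day from the server snapshot
$perMonth = $perDay * 30;            // ~108.7 million queries per month
echo $perMonth / 1000000 * 0.50;     // ~54.35 -> plus zone fees, near the $58.50 quote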

So let's recap:

Servers: $270.68
Bandwidth: $124.20
DNS: $58.50 (and I actually have 4 DNS servers, not just one)

For a grand total of $453.38 for the month.

Hate to say it Amazon, but I pay about $100 a month to host my two servers. The DNS software was a one-time cost, years ago. The servers are SSD RAID based with 8 GB of RAM; total cost was about $1000 each, and they should last at least 3 years but probably much longer ($27.78 a month per server if they last 3 years).
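Even being generous and adding the hardware amortization on top of my hosting bill, the gap is wide; a rough comparison:

<?php
// My setup vs. the cloud quote -- amortizing the hardware over 3 years.
$hosting  = 100.00;            // monthly hosting for both servers
$hardware = 2 * (1000 / 36);   // two $1000 servers over 36 months (~$55.56)
echo $hosting + $hardware;     // ~155.56/month vs. Amazon's $453.38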

So the numbers for a cloud-based service still do not work for me 🙁

PHP session_start makes PHP single threaded

By no means do I consider myself a PHP expert, but I have been using it for quite a few years to build various websites.

If you are doing any websites with a login, you probably need to use PHP sessions to keep track of the user’s login status, preferences or who knows what.

Anytime I need to track a session, I toss this into the PHP code:

session_start();

That initiates user session tracking in PHP. Looks OK to me, been working great for years.

This past week I was developing some PHP/SQL scripts that take a long time to run. While one of those scripts was running, the website would not respond in another browser window.

I assumed the problem was related to MySQL, and that there was a lock on the database until the first request was done.

After a couple of hours of research I determined it was not MySQL, or my framework…

So I started from the ground up with the hello-world code. That worked: I could hammer it hundreds of times per second with no performance problem.

Yet running that single script, which took maybe 30 seconds, would bring the site down; no other page would respond.

Finally, adding code back into my test environment piece by piece, I found it: session_start()

The default session handler writes the session information to disk. To prevent corruption, PHP locks the session file, so two requests from the same user (same session) can't access the session data at the same time. Essentially, PHP becomes single threaded on a per-user basis. Normally this is not an issue for a couple of reasons: 1) web pages are normally very quick, and 2) users don't often execute many pages on the same site at the same time.
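A minimal repro of the lock (hypothetical file name; the sleep() stands in for my slow PHP/SQL work): while this script holds the session, any other page opened from the same browser that calls session_start() just sits and waits:

<?php
// slow.php -- hold the session lock while pretending to do real work.
session_start();   // acquires an exclusive lock on this session's file
sleep(30);         // stand-in for the long-running PHP/SQL script
echo "done";
// Every other request from the SAME session blocks inside its own
// session_start() until this script finishes (or releases the lock).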

In my case the script took a considerable amount of time, so I was unable to get results from a second page. One way around the problem is to use a different session handler (like storing sessions in Memcache).
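A sketch of that, assuming the PECL memcache extension is installed (note that the newer memcached extension also ships a session handler, but it locks sessions by default, so it wouldn't help here unless you turn that setting off):

<?php
// Swap the file-based session handler for memcache (PECL memcache extension).
ini_set('session.save_handler', 'memcache');
ini_set('session.save_path', 'tcp://127.0.0.1:11211');
session_start();   // sessions now live in memcache, with no file lock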

Another way to fix it: do not leave the session open the whole time. When you need the session, open it, access your variables, and close it.

session_write_close();

Now other requests from the same user don't get hung up waiting for the first script to end. PHP will automatically close the session and release the lock when the script ends… but don't wait. Call session_write_close() as soon as you are done with the session.
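Putting it together, a sketch of the pattern (run_long_report() is a made-up stand-in for the slow work):

<?php
session_start();                                  // lock acquired
$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : 0;
session_write_close();                            // lock released -- other pages respond again

run_long_report($userId);                         // the slow part, hypothetical function

session_start();                                  // need to write? re-open briefly...
$_SESSION['report_done'] = true;
session_write_close();                            // ...and close again right away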