MySQL Backup to DropBox Solution – PHP Script

I have been using several MySQL backup and syncing scripts for different applications. I decided to combine a few and create a sort of all-in-one backup solution that backs up to a DropBox folder.

It uses the “DropBox Uploader” class from http://jaka.kubje.org/projects/dropbox-uploader/
It backs up multiple databases.

Download here: http://code.google.com/p/php-mysql-dropbox-backup/
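To give an idea of what the script does, here is a minimal sketch of the core loop. Note: the database names and credentials below are placeholders, and the DropboxUploader method names are from memory, so check the class source for the exact signatures:

<?php
require_once 'DropboxUploader.php';

// Placeholder database names and paths; replace with your own.
$databases = array('blog_db', 'shop_db');
$backupDir = '/tmp';

foreach ($databases as $db) {
    $file = $backupDir . '/' . $db . '-' . date('Y-m-d') . '.sql.gz';

    // Dump and gzip each database (assumes mysqldump is on the PATH).
    system(sprintf('mysqldump -u DBUSER -pDBPASS %s | gzip > %s',
        escapeshellarg($db), escapeshellarg($file)));

    // Upload the dump to a Dropbox folder, then remove the local copy.
    $uploader = new DropboxUploader('you@example.com', 'dropbox-password');
    $uploader->upload($file, 'Backups');
    unlink($file);
}
?>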

Disclaimer: Some of the snippets of code are from existing scripts I have used over the years, so I don’t remember all the sources. If this happens to be your code snippet, let me know and I’ll include the proper credits.

Eve Online Mac OS X CPU throttle

I’ve been playing Eve Online for a while now, and sometimes I want to run it in the background, but due to the CPU load Eve generates this is not really possible.

So I searched around and found a program called cputhrottle (more info here).

It lets you cap the CPU usage of a process at a given percentage.

So let’s say Eve usually takes 80% CPU; with cputhrottle you can cap this at 40%.

All great stuff, but I grew tired of constantly opening Activity Monitor, locating Eve’s process ID, and then going into Terminal to run cputhrottle.
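For reference, this is roughly the manual Terminal step the scriptlet automates (the process name is my assumption; check Activity Monitor for the exact one):

sudo cputhrottle `ps ax | grep "EVE Online" | grep -v grep | awk '{print $1}'` 40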

So, using my tiny knowledge of AppleScript, I wrote a scriptlet that does this process for you.

Here is the app: EveCPUThrottle.app.tgz
and here is the source scriptlet: EveCPUThrottle.scpt.tgz

Feel free to modify whatever you like.

If you find it useful, feel free to donate some ISK to me; my in-game character name is “Vorlean”.

PCI Compliance – Continued

So after many tests, both my own (nmap and openssl) and the official PCI scans, we are finally PCI compliant.

For MediaTemple customers, however, one thing kept failing the test.

This was port 8443, which is used by the Plesk/Virtuozzo service and is kept open by having “Offline Management” enabled.

To solve this I asked MediaTemple for help, and they gladly disabled “Offline Management”. This did not completely solve the issue, however: the port was still open, and since Plesk was still installed it would overwrite any customization I made to the iptables rules to block this port, even when I went through the process of adding the rule correctly (at least as far as I could find).

So to solve this I did the following:

I edited the crontab:

crontab -e

and added the blocking line for iptables:
* * * * * /sbin/iptables -A INPUT -p tcp --dport 8443 -j REJECT

Additionally, to make sure I did not risk locking myself out, I also added an ACCEPT line for my SSH port:
* * * * * /sbin/iptables -A INPUT -p tcp --dport MYPORTNUMBER -j ACCEPT

Cron runs this every minute, so as soon as Plesk overwrites the iptables rules, the crontab fires and the rule is added again.
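One caveat: because the line appends unconditionally, the INPUT chain slowly fills up with duplicate rules between Plesk resets. That is harmless for blocking purposes, but if your iptables is new enough to support the -C (check) flag, a variant like this (an untested sketch on my part) only re-adds the rule when it is missing:

* * * * * /sbin/iptables -C INPUT -p tcp --dport 8443 -j REJECT 2>/dev/null || /sbin/iptables -A INPUT -p tcp --dport 8443 -j REJECT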

This solved the issue which failed the PCI compliance test for me.

In theory this should also work with “Offline Management” enabled, which is going to be my next experiment. Additionally, I am going to try adding an exception for my IP to the iptables rules to see if that works.

For now, though, we are PCI compliant, and the cron job will make sure it stays that way until I find a more permanent solution or MediaTemple updates the Plesk installation.

PCI Compliance, Weak SSL Ciphers, Plesk, etc

For all those struggling with the marketing stunt that is PCI compliance, here are some pages I found that helped make our DV Base server at MediaTemple pass the PCI test.

Please check out the following links for help on this:

Weak SSL:
465 (smtps/qmail) – http://www.qmailwiki.org/index.php/Qmail-control-files#control.2Ftlsserverciphers
imap/pop – follow the instructions at http://www.oscommerceuniversity.com/lounge/index.php?topic=265.0
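For qmail, the linked page comes down to editing the control/tlsserverciphers file. As a sketch (the exact cipher list should come from the wiki above; this one is just an example that drops SSLv2 and export-grade ciphers, assuming the standard qmail layout):

echo 'HIGH:MEDIUM:!SSLv2:!EXP:!ADH:!aNULL' > /var/qmail/control/tlsserverciphers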

Server-wide SSLv2 disable and weak-cipher removal for all virtual domains:
create a new file:
vi /etc/httpd/conf.d/zz000_psa_httpd_weak_ssl_disable.conf
press ‘i’ to insert
SSLProtocol ALL -SSLv2
SSLCipherSuite ALL:!ADH:!EXPORT:!SSLv2:RC4+RSA:+HIGH:+MEDIUM:-LOW

press ‘Esc’, then type ‘:wq’ and press ‘Enter’ to save and quit
/etc/init.d/httpd stop
/etc/init.d/httpd start
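To verify the change took effect, you can try to force an SSLv2 handshake from another machine; the connection should now fail (assuming your local openssl build still includes SSLv2 client support, and substituting your own domain):

openssl s_client -connect yourdomain.com:443 -ssl2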

Also found this page very helpful: http://www.linux-advocacy.org/web-servers/making-plesk-more-pci-compliant

Questions and additions welcome, of course.

Aperture 3.0.1 update

From http://support.apple.com/kb/TS2518:

This update improves overall stability and addresses a number of issues in Aperture 3, including (abridged):

  • All the features we added with this new version…
  • Some existing features that didn’t really work all that well anymore with the new version…
  • and much more.

And, before I forget, does it make a difference? It did for me: most notably, loading times improved and browsing through pictures was snappier.

I have some remaining face detection running and it’s using a fair chunk of CPU (130%+). Also, just having Aperture sit idle takes up about 850 MB+ of memory.

Aperture 3 first impressions

So my copy arrived today and I just finished installing it.
I won’t go so far as to call this a review, since there are much more knowledgeable people out there to do that, but I do want to share some initial impressions from first launch.

My setup for using Aperture 3: MacBook Pro 2.2 GHz C2D, 4 GB RAM, 120 GB HD. I have about 4500 RAW pics, of which most (80%) are stored on a WD MyBook Studio via FW800.

  1. At first launch you are presented with the library upgrade screen, which I started. The total upgrade took 41 minutes with Aperture as the only program running (and since it hogs most of your CPU and memory while it does this, it makes it pretty clear it wants to stay the ONLY thing running).
  2. First question after launch: would you like to see your photos on a map? I answered yes. The Activity window showed it took about 2 minutes for 250+ pics.
  3. The Activity window also shows ‘Faces Detection’ running for my pics, and as of writing this post it is still going. Judging by the time it has taken so far, I estimate it will take about as long as the upgrade did.
  4. While all of this is going on, browsing your pictures is slow, but that might improve once the initial setup actions are done. I’ll update this post once I know.
  5. There are dedicated Facebook, MobileMe and Flickr buttons at the top which, when pressed, prompt you to set up your account. I pressed the Flickr button, which launches the browser and directs you to Flickr where you authorize the Aperture Uploader. I set it up, and here is a link to my flickr upload from Aperture

**UPDATE**
Also check out this blogger’s experience with Aperture 3:
http://ishotalot.com/2010/02/turning-the-other-cheek-on-apertures-faces/

Finally… Aperture 3 is released

Almost 2 years to the day since the release of version 2.0 (Feb. 12, ’08), Apple releases version 3.0: http://www.apple.com/aperture/

I am personally very happy, since I still use Aperture (instead of Lightroom). Mainly this was because I never had the time to research transferring over to Lightroom, which might or might not have been an easy project.

Anyway, v3.0 is ordered and on its way; let’s wait and see what it brings.

Comments overload…

Since I receive mainly spam in my comment box (Akismet works great, but I still have to look through it to double-check), I am going to try reducing it by limiting the ability to comment to registered users only.

Let’s see if that reduces comment spam. Apologies for the extra steps this involves.

Easily moving your ‘Mobile Applications’ folder

Since my iPhone app folder had become quite substantial (4.5 GB+), I wanted to move these files without a lot of hassle.

It should be noted that my iTunes configuration is somewhat different due to the fact that I use Libra (http://www.sillybit.com/libra/) to manage a few different libraries on my Mac.

Since I already had all the audio and video files stored on an external drive I just needed to find an easy way of doing the same thing with my applications.

This is what I did:

– Move the ‘Mobile Applications’ folder to my external drive
– Open up iTunes to find that it can’t find the files anymore and has no way of batch-changing all the links to the new location
– Spend some time looking online
– Spend some more time not finding a simple solution
– Get an idea for my own solution
– Create an ‘alias’ folder by option-command-dragging the moved Mobile Applications folder back to its original location (in my case ~/Music/iTunes), making sure the name says Mobile Applications
– Restart iTunes to find it has no problem finding the files and everything works like it should

So there it is… It was as easy as moving the folder and creating an alias in the original location. At least for me this worked…
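For those who prefer Terminal, a symbolic link should (untested assumption on my part) do the same job as the Finder alias; the volume name below is hypothetical:

mv "$HOME/Music/iTunes/Mobile Applications" /Volumes/MyBook/
ln -s "/Volumes/MyBook/Mobile Applications" "$HOME/Music/iTunes/Mobile Applications"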

Update: It seems the above steps led to my entire app library being copied to the iPhone (including apps I had previously removed from the phone but still had in the library). This could be because I restarted iTunes after copying and removing the original folder, before I created the alias. So just keep in mind that this could mean having to reselect which apps you want to copy to the iPhone, or, like I’m doing in my case, taking this opportunity to remove the ones you don’t want to put on there anymore.

New Development Horizons – Getting into iPhone Dev

Today I enrolled in the iPhone Developer Program. Since I have no Objective-C, C++ or C knowledge, I looked for tools beforehand to make the step into development simpler.

I found a couple: PhoneGap, Titanium and Corona.

Now, the difference is that PhoneGap and Titanium are both tools that help you write iPhone apps based on HTML and JavaScript, including many of the popular libraries (Prototype, MooTools, jQuery, etc.), thereby essentially removing the learning curve of iPhone dev for web designers. Both are also free downloads.

Corona, on the other hand, is a different matter altogether. It lets you write in Lua. Corona therefore does have a learning curve, but (keep in mind I don’t know enough about iPhone dev to say this with complete certainty) it might provide more advanced development possibilities.

For now I will start out using PhoneGap, and if things go smoothly and I have the time to mess about with the Corona Trial download, I might even purchase the Corona package ($99) and use that, since the Lua language seems logical and easy to learn.

I’ll post regularly with updates regarding my Dev journey.

New website launched for Mew fans | Mewlive.com

Recently I launched a new website for Mew fans looking for the Mew – Live in Copenhagen DVD, which is somewhat hard to find.

For more info checkout the website: mewlive.com

**UPDATE** Due to some IP changes (switching to a dedicated IP), mewlive.com may refer back here. If it does, just check back later; it takes some time for the new IP to propagate and direct you to the right page. All OK now!

MediaTemple Grid Service (GS) Experience

Since March ’09 I have been hosting my blog EcoTipsForLife.com and my website move2create.com at MediaTemple.com.

I have been using the Grid (GS) service and can report that I haven’t had any outages (that were not my own doing, of course…) or issues since.

Hosting is reliable, system maintenance is announced and scheduled well in advance, and customer service is fast, responsive and helpful.

All in all, I am a very happy MT customer.

On top of that I am only paying a total of $13.33 a month ($160/year) for their $20 a month Grid Service.

This is because I used the coupon code “retailmenot”, which takes 20% off any plan for the life of the plan, and since I paid yearly the discount was quite substantial.
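(Working backwards from the numbers: $160 / 0.8 = $200, so the yearly prepay rate must have been about $200, and $160 / 12 ≈ $13.33 a month.)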

For more info or to order service from MediaTemple click here

Website Baker – Different CSS backgrounds on every page with one css file

I was creating a new website and decided to use a Website Baker template I had previously made as a base.

This template used a bit of PHP code you can find at www.alistapart.com/d/randomizer/rotate.txt, which randomly rotates between the images in a specified directory.

However for the new website this was not what I needed. I wanted to change the background of the page automatically, but not randomly.

My solution was the following.

Step 1: I modified Website Baker’s frontend.functions.php file, which you can find in the framework folder, and added the following function to output the page id:

// Function to output the current page id (uses Website Baker's PAGE_ID constant)
if (!function_exists('page_id')) {
    function page_id($spacer = ' - ', $template = '[PAGE_ID]') {
        $vars   = array('[PAGE_ID]');
        $values = array(PAGE_ID);
        echo str_replace($vars, $values, $template);
    }
}

Step 2: I added the CSS code to the header of my index.php template file (this way the background image loads properly and I can easily use the page_id function):

<style type="text/css">
#my_div { background-image:url(<?php echo TEMPLATE_DIR; ?>/theimagesdirectory/<?php page_id();?>.jpg);}
</style>

and pointed it at the directory containing the images.

Now I can just put an image into the images directory and name it XX.jpg (where XX is the corresponding page id) to control what image is being displayed for a certain page.
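For example, on a page with id 12 (and a hypothetical template directory name), the style block above would render roughly as:

#my_div { background-image:url(/templates/mytemplate/theimagesdirectory/12.jpg);}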

Of course there might be easier ways of doing this, but I am not a seasoned PHP developer, so I tend to go with whatever I can get to work… and this is working great for my specific needs.

Tip: There are also several ways of using PHP directly in your CSS file. One way I have been using, since I need to compress the CSS file anyway, is to put this code at the top of your CSS file:

<?php

header('Content-type: text/css');

ob_start("compress");
function compress($buffer) {
    // strip /* ... */ comments
    $buffer = preg_replace('!/\*[^*]*\*+([^/][^*]*\*+)*/!', '', $buffer);
    // collapse newlines, tabs and runs of spaces into single spaces
    // (removing every single space outright would break values like "margin: 0 auto")
    $buffer = preg_replace('/\s+/', ' ', $buffer);
    // tighten the remaining whitespace around braces and punctuation
    $buffer = str_replace(array('{ ', ' {'), '{', $buffer);
    $buffer = str_replace(array(' }', '} '), '}', $buffer);
    $buffer = str_replace(array('; ', ' ;'), ';', $buffer);
    $buffer = str_replace(array(', ', ' ,'), ',', $buffer);
    $buffer = str_replace(': ', ':', $buffer);
    return $buffer;
}

?>

And this at the bottom:

<?php ob_end_flush();?>

Change the extension of the file from .css to .php and load it in the header using: <link rel="stylesheet" type="text/css" href="<?php echo TEMPLATE_DIR; ?>/css/style.php" />

There are many resources (http://websitetips.com/articles/optimization/css/crunch/ and http://forums.oscommerceproject.org/index.php?showtopic=691&pid=6347&st=0, to name a few) which show how to compress your CSS files this way, with the added bonus that you can then use PHP code within your stylesheets.

New hosting for move2create.com | Virpus VPS

As of last month I relocated move2create.com to its new home on the VPS I recently got from virpus.com.

As this was my first VPS I expected quite some work, but with the help of resources like vpsmedia.com/articles it proved to be less work than I had imagined. My experience as a beginner VPS user with virpus.com has been excellent, and I would definitely recommend them.


Odiogo added to GeekyNomad.com

GeekyNomad.com posts can now be listened to through the service of Odiogo.com.

You can do this by using the “listen now” button or by using the “subscribe now” button in the sidebar.

Tip: I just wanted to include my experience with the plugin. For me it wasn’t immediately clear how I could manually determine the location of the plugin’s “listen now” button. Then I stumbled across a line in odiogo_listen_button.php (wp-content > plugins > odiogo_listen_button) which I had overlooked.

Around line 113 there is an option that says:

$odiogo_adv_options['manually_insert_listennow_link'] = false;

Set this to true and use the code:

<?php odiogo_listennow();?>

to include the listen button anywhere you want in your template files.

UPDATE: In one of the updates of the Odiogo plugin this option (meaning $odiogo_adv_options['manually_insert_listennow_link'] = false;) was moved from around line 113 to around line 57.

A Naymz.com RepScore of 8 in 20 minutes

A couple of months ago I got an email invitation for Naymz.com and decided to sign up.

Not having paid much attention to it for a while, I recently stumbled across this post: http://collinlahay.com/2008/08/07/link-building-with-naymz/ on Collin Lahay’s website, which had some excellent tips on how to improve your Naymz profile. After reading it I decided to try and improve my own Naymz RepScore. The RepScore is a Naymz.com scoring system determined by factors like profile completeness, connections, references, ID verification, etc.

My starting score was a 3/10.

The following details, step by step, what I did to get a good Naymz.com RepScore.

  1. Create a free profile: http://www.naymz.com
  2. Fill in as much personal/business information as possible. This includes:
     1. Name, Residence, Occupation & Media
     2. Your Resume, which can even be nearly empty, as Collin Lahay’s post mentioned
     3. A Photo
     4. Contact Information
     5. Email address | Add more of your emails for additional points
     6. A short About… section
  3. Add links from any personal, company and/or blog websites you have (e.g. I also included a link to my Fotolia Portfolio). Also make sure to include the feed links if your blog and/or website has them.
  4. Invite family, friends and colleagues to join.
  5. Invite family, friends and colleagues to leave you references.
  6. Add your “Tags”.
  7. Verify your identity | This was a very important step towards improving my score, since it adds up to 250 points (for a RepScore point breakdown, edit your Naymz profile and click on “RepScore Details” and then any of the RepScore links). Naymz.com uses a service called Trufina to verify your identity. Whether or not Trufina truly proves “you are really you” is not really important at this point; Naymz.com uses it, and it’s a good way to instantly improve your RepScore. Using the Trufina coupon code (c4ub8y3n | $3.95 discount) I found here, I ended up paying only about $11 for both a “Trufina ID Badge” and a “Criminal Background Check”, which I think is a good investment.

All in all the result was a RepScore of 8/10, as you can see on my profile here: http://www.naymz.com/search/thomas/korthuis/2071758 – and all without paying for the $9.99/month premium service Naymz.com offers.

All this took me about 20 minutes. The time was also reduced because I already had a lot of info on hand from other social networking websites (e.g. my LinkedIn profile), as you may too.