Saturday, November 22, 2014

Tuesday, November 18, 2014

TCP and UDP

http://www.diffen.com/difference/TCP_vs_UDP

S3cmd tool

S3cmd sync
=========

A bit more powerful is sync. Path name handling is the same as just explained, but the important difference is that sync first checks the list and details of the files already present at the destination, compares them with the local files, and only uploads those that are either missing remotely or have a different size or MD5 checksum. If you ran all the above examples, you'll get output similar to the following from a sync:
~/demo$ s3cmd sync  ./  s3://s3tools-demo/some/path/
dir2/file2-1.log -> s3://s3tools-demo/some/path/dir2/file2-1.log  [1 of 2]
dir2/file2-2.txt -> s3://s3tools-demo/some/path/dir2/file2-2.txt  [2 of 2]
As you can see, only the files that we hadn't uploaded yet, that is those from dir2, were synced this time. Now modify, for instance, dir1/file1-2.txt and see what happens. In this run we'll first check with --dry-run to see what would be uploaded. We'll also add --delete-removed to get a list of files that exist remotely but are no longer present locally (or perhaps just have different names here):
~/demo$ s3cmd sync --dry-run --delete-removed ~/demo/ s3://s3tools-demo/some/path/
delete: s3://s3tools-demo/some/path/file1-1.txt
delete: s3://s3tools-demo/some/path/file1-2.txt
upload: ~/demo/dir1/file1-2.txt -> s3://s3tools-demo/some/path/dir1/file1-2.txt
WARNING: Exiting now because of --dry-run
So there are two files to delete: they're the ones that were uploaded without the dir1/ prefix in one of the previous examples. And there is one file to upload, dir1/file1-2.txt, the file we've just modified.
Sometimes you don't want to compare checksums and sizes of remote vs. local files and only want to upload files that are new. For that, use the --skip-existing option:
~/demo$ s3cmd sync --dry-run --skip-existing --delete-removed ~/demo/ s3://s3tools-demo/some/path/
delete: s3://s3tools-demo/some/path/file1-1.txt
delete: s3://s3tools-demo/some/path/file1-2.txt
WARNING: Exiting now because of --dry-run
See? There is nothing to upload in this case because dir1/file1-2.txt already exists in S3. Its content differs, indeed, but --skip-existing only checks for the file's presence, not its content.
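
Once the dry run shows what you expect, drop --dry-run from the earlier command to apply the changes for real, i.e. delete the two stray files and upload the modified dir1/file1-2.txt:

~/demo$ s3cmd sync --delete-removed ~/demo/ s3://s3tools-demo/some/path/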

http://s3tools.org/s3cmd-howto

https://github.com/s3tools/s3cmd



Bucket policy example

##############################################################

{
  "Id": "Policy1416407058443",
  "Statement": [
    {
      "Sid": "Stmt1416407056513",
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::phab-server/*",
      "Principal": {
        "AWS": [
          "*"
        ]
      }
    }
  ]
}

##############################################################
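
A sketch of how to attach a policy like this, assuming a recent s3cmd (which provides setpolicy) or the AWS CLI, with the JSON saved to a local file named policy.json (the file name is just an example):

s3cmd setpolicy policy.json s3://phab-server
# or, with the AWS CLI:
aws s3api put-bucket-policy --bucket phab-server --policy file://policy.json

Note that this particular policy allows every S3 action on phab-server/* to any principal ("AWS": "*"), so it is only appropriate for test buckets or genuinely public data.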

Monday, November 17, 2014

Puppet - install and configuration

https://docs.puppetlabs.com/
https://www.digitalocean.com/community/tutorials/how-to-install-puppet-to-manage-your-server-infrastructure
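
A minimal install sketch, assuming Ubuntu with the stock puppetmaster/puppet packages and the placeholder hostnames puppet.example.com and node1.example.com; the guides above cover repositories, versions and configuration in detail:

# On the Puppet master
sudo apt-get update
sudo apt-get install puppetmaster

# On each node: install the agent and point it at the master
sudo apt-get install puppet
# in /etc/puppet/puppet.conf, under [agent]:  server = puppet.example.com
sudo puppet agent --test

# Back on the master: sign the node's certificate request
sudo puppet cert list
sudo puppet cert sign node1.example.com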


Foreman to manage Puppet Nodes

How to optimize PHP (file upload limits)

http://www.radinks.com/upload/config.php

http://www.cyberciti.biz/faq/apache-limiting-upload-size/
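
The gist of both links is raising (or capping) the upload and request size limits. An illustrative sketch with example values only, to be tuned per server:

; php.ini
upload_max_filesize = 20M
post_max_size = 25M        ; must be >= upload_max_filesize
max_execution_time = 300

# Apache (httpd.conf, a vhost, or .htaccess): cap request bodies at 25 MB, value in bytes
LimitRequestBody 26214400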

sed command

http://www.grymoire.com/unix/sed.html
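
A few common sed one-liners for quick reference (file.txt is a placeholder; the Grymoire page above goes much deeper):

# replace the first "foo" on each line with "bar"
sed 's/foo/bar/' file.txt

# replace every occurrence on every line
sed 's/foo/bar/g' file.txt

# edit the file in place (GNU sed)
sed -i 's/foo/bar/g' file.txt

# delete blank lines
sed '/^$/d' file.txt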