Posts Tagged ‘directory’

Windows add command to PATH variable – Adding (Putty, VIM) new binaries folder to windows %PATH%

Friday, June 27th, 2014

I want to use vim (text editor) on my Windows 7 work computer and hence installed GVim, the VIM (VI Improved) port for Windows.
VIM works perfectly on Windows, and for those coming from a UNIX background having it installed is a must; however, the installer doesn't add the location of the vim executable to the Windows %PATH%.
The PATH environment variable stores the locations of all binaries that can be executed directly from the command line without typing the full directory path to the binary.

To illustrate what PATH is, let's say you want to make Putty accessible straight from the Windows command line (cmd.exe): you can add Putty's installation folder to the global Windows %PATH%. On my 64-bit Windows PC the Putty binary is installed in C:\Program Files (x86)\PuTTY.

echo %PATH%
 

C:\Perl64\site\bin;C:\Perl64\bin;C:\Program Files\RA2HP;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0;C:\Program Files\WIDCOMM\Bluetooth Software;C:\Program Files\WIDCOMM\Bluetooth Software\syswow64;C:\Program Files (x86)\Hewlett-Packard\HP ProtectTools Security Manager\Bin;C:\Program Files\ActivIdentity\ActivClient;C:\Program Files (x86)\ActivIdentity\ActivClient;C:\Program Files (x86)\QuickTime\QTSystem


To make Putty accessible by typing just putty instead of the full C:\Program Files (x86)\PuTTY\putty.exe, its folder has to be included in PATH. From the Windows command line (cmd.exe) this is done with:

 

set PATH=%PATH%;C:\Program Files (x86)\PuTTY

 

echo %PATH%

 

C:\Users\ggeorgi7\Desktop>echo %PATH%
C:\Perl64\site\bin;C:\Perl64\bin;C:\Program Files\RA2HP;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0;C:\Program Files\WIDCOMM\Bluetooth Software;C:\Program Files\WIDCOMM\Bluetooth Software\syswow64;C:\Program Files (x86)\Hewlett-Packard\HP ProtectTools Security Manager\Bin;C:\Program Files\ActivIdentity\ActivClient;C:\Program Files (x86)\ActivIdentity\ActivClient;C:\Program Files (x86)\QuickTime\QTSystem;C:\Program Files (x86)\PuTTY

To check all the exported variables, use the SET command; here are my default SET variables:

C:> SET
 

ALLUSERSPROFILE=C:\ProgramData
APPDATA=C:\Users\georgi\AppData\Roaming
CLASSPATH=.;C:\Program Files (x86)\Java\jre6\lib\ext\QTJava.zip
CommonProgramFiles=C:\Program Files\Common Files
CommonProgramFiles(x86)=C:\Program Files (x86)\Common Files
CommonProgramW6432=C:\Program Files\Common Files
COMPUTERNAME=GEORGI
ComSpec=C:\Windows\system32\cmd.exe
DEFLOGDIR=C:\ProgramData\McAfee\DesktopProtection
FP_NO_HOST_CHECK=NO
HOMEDRIVE=C:
HOMEPATH=\Users\ggeorgi7
LOCALAPPDATA=C:\Users\ggeorgi7\AppData\Local
LOGONSERVER=\\G1W4730
NUMBER_OF_PROCESSORS=4
OS=Windows_NT

To make the inclusion of the VIM, Putty or other binary directory into the PATH variable permanent:


From the Windows desktop, right-click My Computer and click Properties.
In the System Properties window, click on the Advanced tab (or on Advanced system settings).
In the Advanced section, click the Environment Variables button.


Finally, in the Environment Variables window, search for the Path variable in the System Variables section and click the Edit button.
Add or modify the path lines to include the paths to the binaries you wish to have easy access to.
Note that each directory should be separated with a semicolon (;), e.g.
 

C:\Program Files;C:\Winnt;C:\Winnt\System32;C:\Perl64\bin
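Alternatively, the permanent change can be scripted from cmd.exe with the built-in setx command – a minimal sketch (note setx writes the user-level PATH, truncates values longer than 1024 characters, and the new value only appears in newly opened command prompts; add /M for the system-wide variable, which needs an administrative prompt):

setx PATH "%PATH%;C:\Program Files (x86)\PuTTY"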

 

 

Preserve domain name after redirect with mod_rewrite and some useful mod_rewrite redirect examples – Redirect domain without changing URL

Friday, July 11th, 2014

If you're a webhosting company sysadmin, sooner or later you will be asked by an application developer or some client to redirect from an Apache webserver to some other webserver / URL / IP, in a way that the original domain name stays in the browser after the redirect.

I'm aware of two major ways to do the redirect on webserver level:

1. To redirect From Apache host A to Webserver on host B using ReverseProxy mod_proxy

2. To use Mod Rewrite to redirect all client requests on host A to host B.

There is quite a lot said and written online on using mod_rewrite to redirect URLs.
So in this article I will not say anything new, just present some basic scenarios on redirecting with mod_rewrite and some use cases.
Hopefully these examples will help some colleague sysadmin solve his crazy boss's redirection tasks 🙂 I'm saying crazy boss because I already worked for a start-up company which was into internet marketing, and the CEO had insane SEO ideas, often impossible to achieve …

a) Dynamic URL Redirect from Apache host A to host B without changing the domain name in the browser URL, keeping everything after the domain in the request

Lets say you want to redirect incoming traffic from DomainA to DomainB, keeping the whole user browser request, i.e.

Redirect:

http://your-domainA.com/whole/a/lot/of/sub/directory/query.php


passing the whole request including /whole/a/lot/of/sub/directory/query.php, so that Apache serves:

http://your-domainB.com/whole/a/lot/of/sub/directory/query.php

in the browser.
To do it with mod_rewrite, you have to add the following mod_rewrite rules to .htaccess:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^your-domainA\.com [OR]
RewriteCond %{HTTP_HOST} ^www\.your-domainA\.com
RewriteRule ^(.*) http://your-domainB.com/$1 [P]

or include these rules somewhere in the VirtualHost configuration of your domain.
 

The above mod_rewrite rules will forward any request for your-domainA.com to your-domainB.com while preserving the old domain http://your-domainA.com in the browser URL bar; the content, however, will be served by http://your-domainB.com, e.g.:
 

http://yourdomainA.com/YOUR-CUSTOM-REQUEST-ADDRESS


is transparently served by

http://yourdomainB.com/YOUR-CUSTOM-REQUEST-ADDRESS


WARNING !!  If you're concerned about good SEO positioning in search engines, be sure to never ever use such redirects. Such redirects will cause the two domains to show duplicate content
and will make Search Engines (Google, Yahoo, Yandex etc.) reduce your PageRank !!

Besides that, such a redirect invokes mod_rewrite on each and every request, so from a performance standpoint it is a CPU killer (for such redirects the native mod_proxy ProxyPass is much more efficient – on websites with hundreds of thousands of requests daily, rewrite-based redirects will wear your hardware out badly …)
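For comparison, here is a minimal mod_proxy sketch of the same forwarding (a hedged example – the domain names are placeholders, and mod_proxy / mod_proxy_http must be enabled as noted below):

<VirtualHost *:80>
ServerName your-domainA.com
# content is fetched from domainB while the browser keeps showing domainA
ProxyPass / http://your-domainB.com/
ProxyPassReverse / http://your-domainB.com/
</VirtualHost>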

P.S. ! The mod_rewrite and proxy modules need to be enabled beforehand.
On Debian Linux, make sure the following symlinks exist in /etc/apache2/mods-enabled, pointing to the proper existing files in /etc/apache2/mods-available:

debian:~#  ls -al /etc/apache2/mods-available/*proxy*
-rw-r--r-- 1 root root  87 Jul 26  2011 /etc/apache2/mods-available/proxy_ajp.load
-rw-r--r-- 1 root root 355 Jul 26  2011 /etc/apache2/mods-available/proxy_balancer.conf
-rw-r--r-- 1 root root  97 Jul 26  2011 /etc/apache2/mods-available/proxy_balancer.load
-rw-r--r-- 1 root root 803 Jul 26  2011 /etc/apache2/mods-available/proxy.conf
-rw-r--r-- 1 root root  95 Jul 26  2011 /etc/apache2/mods-available/proxy_connect.load
-rw-r--r-- 1 root root 141 Jul 26  2011 /etc/apache2/mods-available/proxy_ftp.conf
-rw-r--r-- 1 root root  87 Jul 26  2011 /etc/apache2/mods-available/proxy_ftp.load
-rw-r--r-- 1 root root  89 Jul 26  2011 /etc/apache2/mods-available/proxy_http.load
-rw-r--r-- 1 root root  62 Jul 26  2011 /etc/apache2/mods-available/proxy.load
-rw-r--r-- 1 root root  89 Jul 26  2011 /etc/apache2/mods-available/proxy_scgi.load

debian:/etc/apache2/mods-enabled:~# ls *proxy*
proxy.conf@  proxy_connect.load@  proxy_http.load@  proxy.load@


If proxy support is not enabled in Apache on Debian / Ubuntu Linux, either create the symbolic links as shown in the above paste, or as root issue:
 

a2enmod proxy_http
a2enmod proxy
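
a2enmod only creates the symlinks; reload Apache afterwards so the modules actually get loaded:

/etc/init.d/apache2 restart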

 

b) Redirect Main Domain requests to other Domain specific URL
 

RewriteEngine On
RewriteCond %{HTTP_HOST} ^your-domainA\.com
RewriteRule ^(.*) http://your-domainB.com/YOUR-CUSTOM-URL [P]

Note that no matter what kind of subdirectory you request on http://your-domainA.com (lets say you type in http://your-domainA.com/My-monkey-sucks), it will get redirected to:

http://your-domainB.com/YOUR-CUSTOM-URL

Sometimes this is convenient for SEO, because it lets you redirect any request (including requests mistyped by users or Bot Crawlers) to a real existing landing page.

c) Redirecting an IP address to a Domain Name

This probably a very rare thing to do as usually a Domain Name is redirected to an IP, however if you ever need to redirect IP to Domain Name:

RewriteCond %{HTTP_HOST} ^##\.##\.##\.##
RewriteRule (.*) http://your-domainB.com/$1 [R=301,L]

Replace ## with the digits of your IP address; the backslash (\) is used to escape the dots (.) – dots are otherwise interpreted as "any character" by mod_rewrite.

d) Rewriting URL extensions from .htm to .php, .doc to .docx etc.

Lets say you're updating an old website with .htm or .html pages to serve .php files with the same names, or all your old .doc files are converted and replaced with .docx and you need Apache to redirect all .doc requests to .docx. Use the following rewrite rules:
 

Options +FollowSymlinks
RewriteEngine on
RewriteRule ^(.*)\.html$ $1.php [NC]

Options +FollowSymlinks
RewriteEngine on
RewriteRule ^(.*)\.doc$ $1.docx [NC]

The [NC] flag at the end means "No Case", or "case-insensitive"; it will not matter whether the files are requested with capital or small letters – they will be matched regardless.

Using such a rewrite will not make Apache redirect the old .html, .htm, .doc URLs – they will still be accessible, again creating duplicate content, which has a negative impact on Search Engine Optimization.

The better way to redirect the old extensions is with an external redirect:

Options +FollowSymlinks
RewriteEngine on
RewriteRule ^(.+)\.htm$ http://your-domainB.com/$1.php [R,NC]

The [R] flag makes mod_rewrite send an HTTP "MOVED TEMPORARILY" redirect, aka "302", to the browser. To make search engines and other spidering entities permanently update their links to the new locations, send a "301" ("MOVED PERMANENTLY") instead with [R=301,NC].

e) Grabbing content from URL with Mod Rewrite and passing it to another domain

Lets say you want the zip files contained in the directory files/ to be redirected from your current webserver on domainA to domainB's download.php script, with the path components passed as arguments to the script:

Options +FollowSymlinks
RewriteEngine on
RewriteRule ^files/([^/]+)/([^/]+)\.zip https://www.pc-freak.net/download.php?section=$1&file=$2 [R,NC]


f) Shortening URLs with mod_rewrite

This is useful if you have a page buried under some fuzzy, long, hard-to-remember URL and you want to make it accessible via a shorter URL without physically moving the files into a short-named directory:

Options +FollowSymlinks
RewriteEngine On
RewriteRule ^james-brown /james-brown/files/download/download.php

The above rule makes requests coming to http://your-domain.com/james-brown?file=my.zip be served by http://your-domain.com/james-brown/files/download/download.php?file=my.zip

g) Get rid of the www in your domain name

Nowadays many people are used to typing www.your-domain.com; if this annoys you and you don't want them to see the annoying www nonsense in the served URLs, use this:

Options +FollowSymlinks
RewriteEngine on
RewriteCond %{http_host} ^www\.your-domain\.com [NC]
RewriteRule ^(.*)$ http://your-domain.com/$1 [R=301,NC]

That's mostly it for the common uses of mod_rewrite redirection – there are thousands of other nice ones. If you know others, please share.


References and thanks to:

How to redirect domain without changing the URL

More .htaccess tips and tricks – part 2

 

 

Linux find files while excluding / ignoring some files – Show all files on UNIX excluding hidden . (dot) files

Friday, August 22nd, 2014

A colleague of mine (Vasil) asked me today how he can recursively chmod all files in a directory while excluding the unreadable ones (those returning permission denied). He was supposed to fix a small script which was changing permissions like:

chmod 777 ./
chmod: cannot access `./directory': Permission denied
chmod: cannot access `./directory/file': Permission denied
chmod: cannot access `./directory/onenote': Permission denied

First thing that came to my mind was to loop with a for loop over find's output and grep away the "Permission denied" errors (note that find prints those on stderr, hence the 2>&1):

for i in $(find . -print 2>&1 | grep -vi 'permission denied'); do echo chmod 777 $i; done

This works, but if the chmod has to be done on a few million files, such a loop could be a real resource / CPU eater.
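
If the goal is simply to chmod everything while silently skipping unreadable entries, a minimal sketch that avoids the shell loop altogether (the permission errors on stderr are discarded):

find . -type f -exec chmod 777 {} + 2>/dev/null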

The better way to omit specific files is to use Linux find's native syntax only.

find . -type f \( -iname "*" ! -iname "onenote" ! -iname "file" \)

The above find will print all files in . (the current directory from where find is started), except the files onenote and file.
To exclude more names, just append additional ! -iname clauses, as in the sketch below.
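
For instance, a hypothetical variant that also skips temporary files (the *.tmp pattern is only an illustration):

find . -type f \( ! -iname "onenote" ! -iname "file" ! -iname "*.tmp" \)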
 

Search and show all files in Linux / UNIX except hidden . (dot) files

Another thing he wanted to do is ignore printing of hidden . (dot) files like .bashrc, .profile and .bash_history while searching for files – there are plenty of annoying .* files.

To ignore printing with find all filesystem hidden files from directory:

find . -type f \( -iname "*" ! -iname ".*" \)

On web hosting webservers the most common file required to be omitted in file searches is .htaccess:

find . -type f \( -iname "*" ! -iname ".htaccess" \)

In order to print all hidden files in a directory except .bashrc and .bash_profile:

find . -type f \( -iname '.*' ! -iname '.bashrc' ! -iname '.bash_profile' \)

Another useful Linux find application for scripting purposes is listing only the files present in the current directory (simulating ls). In case you wonder why on earth to use find and not the regular ls command: it is useful in scripts which have to walk through millions of files (for reference see how to delete a million files in the same folder with Linux find):

find . ! -name . -prune

./packages
./bin
./package

"! -name . " –  means any file other than current directory

prune – prunes all the directories other than the current directory.

A more readable way to list only files in current folder with find is – identical to what above cmd:

find ./* -prune

./packages
./bin
./mnt

If you want to exclude the /mnt folder with its sub-directories and files from a find search, use the -prune option:

find . -name mnt -prune -o -print

 

 

Improve Websites SEO: Optimize images to Increase website loading performance on Linux server – Image Compress tools

Friday, December 5th, 2014

Part of our daily life as Web hosting system administrators is to constantly strive to better utilize our Linux / Windows hosting servers' hardware.
Therefore it is our constant task to look for new better ways to optimize our Apache Sites and Webservers in order to return served application content light fast to keep the Boss and customers happy 🙂

There are things to tune for better server performance and better CPU / memory utilization on both the application server side and the website backend – programming code, HTML and pictures / images.

Thus it is critically important to not only keep the Webserver / PHP engine optimized but keep hosted sites  stored images and source code clean and efficient.

We as admins usually can't directly interfere with cleaning the source code, and often we have to host crappily written sites with picture upload forms full of un-optimized image files produced by old photo cameras, "ancient" mobile phones, Win XP MS Paint, various versions of Photoshop, GIMP etc.

It is a well known fact that a big part of the Website User Experience is how fast a page loads; thus slow-loading HTML / CSS images have a negative impact on the user's look & feel of a website.

Therefore by optimizing the size of hosted sites' images, you save network bandwidth and, in the case of large gallery sites, HDD disk space.

On Linux, there are already many command line tools to inspect and optimize (compress) the size of PNG, JPEG, GIF, BMP, PNM and TIFF images; the most famous ones are:

  • optipng – PNG optimizer that recompresses image files to a smaller size, without losing any information.
  • jpegoptim –   lossless JPEG optimization (based on optimizing the Huffman tables) and "lossy" optimization based on setting a maximum quality factor.
  • pngcrush – Recommended tool to use by Stoyan Stefanov (Yahoo Yslow Developer)
  • jpegtran – Recommended to use by Google 
  • gifsicle –  command-line tool for creating, editing, and getting information about GIF images and animations. 

It is hence useful to first run the available Linux image optimization tools manually (to get an idea what they do) and later automate them as scripts to optimize the size of server-stored images and make pictures load faster on websites, thus improving the End User Experience and speeding up image content delivery to the GoogleBot / YahooBot / Bing crawlers, which will make Search Engines position the server-hosted sites better (more SEO friendly).

 

  • How much space (megabytes / gigabytes) can picture compression save you?

If you run it on a 500MB image directory, you can probably save about 20 to 50MB, so don't expect an extraordinary file-size reduction; however, a 5% to 10% reduction is not bad either. If you host 100 sites, each with half a gig of data, this would mean a saving of 5GB of data and some 5GB from backups 🙂 In extraordinary cases you can expect a 20% to 30% storage reduction. For even better image compression you can try out GIMP's Save for Web option.
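
To measure the actual gain on your own data, a quick before / after check (assuming the sites live under /home/sites):

du -sh /home/sites   # size before optimizing
# … run the optimizers from the examples below …
du -sh /home/sites   # size after, compare the two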
 

  • Installing jpegtran, optipng, jpegoptim, pngcrush, gifsicle on Debian / Ubuntu (deb based) Linux
     

apt-get install --yes libjpeg-progs optipng jpegoptim pngcrush gifsicle

 

  • Installing jpegtran, optipng, jpegoptim, pngcrush, gifsicle on Fedora / CentOS / RHEL (RPM based distros)
     

yum -y install pngcrush libjpeg-turbo-utils opt-jpg opt-png opt-gif


gifsicle is not available by default on Redhacks 🙂 but there is an RPM package for Fedora from http://pkgs.repoforge.org/gifsicle/

 

Some examples of running image compression on GNU / Linux

  • optipng and jpegoptim optimize for all files in directory
     

cd /home/sites/

find . -iname '*.png' -print0 | xargs -0 optipng -o7 -preserve
find . -iname '*.jpg' -print0 | xargs -0 jpegoptim --max=90 --strip-all --preserve --totals


In the jpegoptim command, the option --strip-all will strip any metadata, including Exif data, from the images. For websites JPEG metadata is usually not needed, so it's usually OK to strip it.

The above jpegoptim example will slightly decrease JPEG image quality to 90%. A quality level of 90 is still high enough that website visitors are unlikely to spot any visible quality reduction / defects in the images.

 

  • pngcrush all files in a directory example
     

cd /home/sites/

IMG_DIR=.

for png in `find $IMG_DIR -iname "*.png"`; do
    echo "crushing $png …"
    pngcrush -rem alla -reduce -brute "$png" temp.png

    # preserve original on error
    if [ $? = 0 ]; then
        mv -f temp.png "$png"
    else
        rm temp.png
    fi
done

  • Run jpegtran on sites directory
     

find /home/sites -name "*.jpg" -type f -exec jpegtran -copy none -optimize -outfile {} {} \;

 

  • Set a script to compress / reduce size of Sites Images


Here is a basic optimize_images.sh which I used earlier; it was reducing the overall image size by just 5 to 10%. Then I found the much improved version of the optimize images shell script (useful to clear up EXIF picture data and comments from JPG / PNG files). The script execution could take a very long time on large image directories and thus could cause high HDD disk I/O; however, if run once a week at night time it's not such a big deal.

To set it to run on your server as a cronjob:
 

cd /usr/sbin/
wget -q https://www.pc-freak.net/bshscr/optimize_images2.sh
crontab -u root -e 


Sample cron job to run twice a month, on the 10th and the 27th, at 3 o'clock AM:
 

 00 3 10,27 * * /usr/sbin/optimize_images2.sh 2>&1 >/dev/null


Also if you need to further optimize millions of tiny PNG files, the Yahoo Smush.it service could be helpful. For compression maniacs it's worth checking out also the TinyPNG service (however be aware that this service compresses files with significant quality loss, making picture quality visibly deteriorated).

Besides optimizing server stored Pictures, here are some other stuff that helps in increasing server utilization / lower webpages loading time.

Starting with the installation (when a site is to use Apache + PHP for its backend), the first thing to do on a freshly installed Linux server is to implement the following list of common Apache Timeout variables that help better scale the webserver for the hosted CMS-es, enable webserver compression (mod_deflate), enable eAccelerator and tune common PHP variables etc.

Another thing I sometimes use to speed up Apache child response time by up to 20-30% is to include any .htaccess mod_rewrite rules directly in the VirtualHost / httpd.conf Apache configuration.

On heavily loaded sites – online stores / large company web portals with more than 60,000 – 100,000 unique IP visitors a day – it is a useful tip to disable Apache logging to access.log / error.log completely.

Often when old-architecture websites are moved from an older Linux OS version to a newer one, with newer versions of Apache / PHP, the sites work without major code rework but use many functions which are already obsolete, and thus a lot of WARNING message crap is logged into php_error.log / error.log. To save disk space and decrease hard disk I/O operations it is good to disable PHP Notices and Warnings messages.
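
A minimal php.ini sketch for that (the directives are standard; exactly which levels to silence is an assumption depending on how chatty the hosted application is):

; stop reporting notices / warnings / deprecation messages
error_reporting = E_ALL & ~E_NOTICE & ~E_WARNING & ~E_DEPRECATED
display_errors = Off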
 

Remove \r (Carriage Return) from string with standard bash shell / sed / tr / vim or awk – Replace \r hidden messy characters from files

Tuesday, February 10th, 2015


I've been recently writing this Apache webserver / Tomcat / JBoss / Java decommissioning bash script. Part of the script includes extracting from httpd.conf the DocumentRoot variable configured for an Apache host.
I was using following one liner to grep and store DocumentRoot set directory into new variable:

documentroot=$(grep -i documentroot /usr/local/apache/conf/httpd.conf | awk '{ print $2 }' |sed -e 's#"##g');

The above line greps for documentroot, prints the 2nd column of the match (which is the Apache-set docroot) and then removes any " chars.

However I faced the issue that the parsed string contained in the $documentroot variable was mysteriously containing \r – the Carriage Return (CR) sent by Mac OS and Apple computers. For those who don't know, the End of Line in UNIX / Linux OS-es is LF – often abbreviated as \n (translated as "new line"), while Windows PCs use for EOL CR + LF – known as the infamous \r\n. I was running the script from a server running SuSE SLES 11 Linux, meaning the plain LF end of line is standardly used; however it seems someone had edited the httpd.conf earlier with a text editor from Mac OS X (Terminal). Thus I needed a way to remove the \r CR character out of the variable, because otherwise I couldn't use it to properly exec tar to archive the documentroot-set directory – the documentroot directory was showing as nonexistent.

Opening the httpd.conf in a standard editor didn't show the \r at the end of
the directory; e.g. I could see in the file, when opened with vim:

DocumentRoot "/usr/local/apache/htdocs/site/www"

However the \r character was obviously there; to visualize it I had to use cat's -v option (--show-nonprinting):

cat -v /usr/local/apache/conf/httpd.conf

DocumentRoot "/usr/local/apache/htdocs/site/wwwr"


1. Remove the \r CR with bash

To solve that with bash, I had to use another quick bash expansion that strips the trailing \r from $documentroot; here is how:

documentroot=${documentroot%$'\r'}

It is also possible to use the same approach to remove "broken" Windows \r\n Carriage Returns after a file is migrated from Windows to a Linux / FreeBSD host:

documentroot=${documentroot%$'\r\n'}

 

2. Remove the \r Carriage Return character with sed

Another way to remove (delete) Windows / Mac OS Carriage Returns when migrating to UNIX is with sed (stream editor):

sed -e 's/\r//g' filename > filename_out.txt


3. Remove the \r CR character with tr

There is a third way to do it with tr – the translate-or-delete-characters old school *nix command:

tr -d '\r' < file_with_carriagereturns > file_without_carriage_returns

 

4. Remove \r CRs with awk (pattern scanning and processing language)

 awk '{ sub(/\r$/, ""); print }' inputf_with_crs.txt > outputf_without_crs.txt


5. Delete the \r CR with the VIM editor

:%s/\r//g


6. Converting files between DOS / UNIX line endings with the dos2unix and unix2dos command line tools

For sysadmins who don't want to bother writing code to convert CRs when moving files between Windows and UNIX hosts, there are the installable dos2unix and unix2dos commands.
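
A minimal usage sketch (both tools convert the file in place by default):

dos2unix httpd.conf   # CR+LF -> LF
unix2dos httpd.conf   # LF -> CR+LF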

All done Cheers ! 🙂

Redirect http URL folder to https e.g. redirect (http:// to https://) with mod_rewrite – redirect port 80 to port 443 Rewrite rule

Saturday, July 17th, 2010

There is a quick way to achieve a full URL redirect from a normal unencrypted HTTP request to SSL-encrypted HTTPS.

This can be achieved with the Redirect and RedirectMatch directives (which, by the way, belong to mod_alias, not mod_rewrite).

For instance let's say we'd like to redirect http://www.pc-freak.net/blog to https://www.pc-freak.net/blog.
We simply put in our .htaccess file the following rule:

Redirect permanent /blog https://www.pc-freak.net/blog

Of course this rule assumes that the current working directory where the .htaccess file is stored is the main domain directory, e.g. /.
However this kind of redirect is quite inflexible, so for more complex redirects you might want to take a look at the RedirectMatch directive.

For instance if you intend to redirect all URLs like http://www.pc-freak.net/blog/something/asdf/etc. – i.e. everything that includes the string /blog/ – to the equivalent https:// URLs, you might use an .htaccess RedirectMatch rule like:

RedirectMatch permanent ^/blog/(.*)$ https://www.pc-freak.net/blog/$1

Hopefully your redirect from the http protocol to the https protocol is now complete.
Also consider that the plain Redirect directive is faster at processing requests, so wherever you can I recommend using it instead of RedirectMatch, which has to evaluate a regular expression on every request; both are still much lighter than pulling in mod_rewrite.
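
And since the title promises a Rewrite rule: here is a commonly used mod_rewrite equivalent that redirects all plain HTTP (port 80) requests to HTTPS (port 443) – a sketch assuming mod_rewrite is enabled:

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]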

Linux: Rename all files extension from upper to lower cases

Friday, February 14th, 2014

Lets say you're an admin involved in webhosting and, due to a programmer's mistake, you end up with a directory full of files with extensions in upper case, while the actual version of the website reads the pictures only in lower case; hence you would like to transform them to lower case.
To give an example and illustrate what I mean, lets say you have in a directory files like:

filename.JPG, picture.PNG, new-picture.GIF

and you would like all files to be renamed to lower extension characters, i.e.:

filename.jpg, picture.png, new-picture.gif.
 

# find . -name '*.*' -exec sh -c 'a=$(echo {} | sed -r "s/([^.]*)\$/\L\1/"); [ "$a" != "{}" ] && mv "{}" "$a" ' \;
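
If the perl rename utility is available (on Debian-like systems it usually is), the same can be done shorter – a sketch assuming the perl-style rename syntax, not the util-linux rename found on some RPM distros:

# rename 's/\.([^.]+)$/.\L$1/' *.*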

That's all enjoy 🙂

GNU / Linux: Resuming failed / interrupted (scp, sftp, ftp) file or directory upload with rsync

Saturday, October 20th, 2012


You probably wondered how, and whether, it is possible to continue / resume an interrupted SFTP / FTP / SCP file transfer?

Continuing a failed upload is very useful, especially for people like me who use Linux over wireless with constant Internet failures, or who have to move quickly from one wireless location to another while an sftp user@host upload is in progress. The SFTP protocol was not designed with resumable uploads in mind, just like the web HTTP protocol (POST) file upload was not. Thus there is no direct way to continue a file upload using some embedded SCP / FTP protocol feature.
This is really bad especially if you're uploading some enormous 10 Gigabyte file and the upload breaks at 95% 😉

Thankfully, though the default protocols do not support resuming, rsync does –

there is a possibility to continue failed SCP uploads!
Using rsync, you can continue any failed upload, no matter whether the protocol supports resume or not 🙂

To continue an (interrupted) SCP, SFTP (SSH) or FTP transmission with rsync:


# rsync --rsh='ssh' -av --progress --partial interrupted_file_name_to_be_uploaded.rar root@UPLOAD-HOST.COM:/root

If you need to resume a failed upload to an SSH server running on an unusual port number – let's say SSH listening for connections on port 2202 – use:


# rsync --rsh='ssh -p 2202' -av --progress --partial interrupted_file_name_to_be_uploaded.rar root@UPLOAD-HOST.COM:/root

For efficiency when uploading large files it is also useful to use rsync's file compression capabilities with the -z switch:


# rsync --rsh='ssh -p 2202' -avz --progress --partial interrupted_file_name_to_be_uploaded.rar root@UPLOAD-HOST.COM:/root

Sometimes it is necessary to resume a failed upload of a whole directory with some sub-directory structure; to do so:


# rsync --rsh='ssh -p 22' -avztrlpog --progress --partial interrupted_directory_to_be_uploaded/ root@www.pc-freak.net:/root

Here is each rsync switch you saw above shortly explained:


-e ssh (--rsh='ssh') rsync will use the ssh client instead of rsh
-z compress file transfer
-t preserve time (other attributes as owner or permissions are also possible)
-l copy symlinks as symlinks
-r recursive into subdirectories
-p preserve permissions
-o preserve owner
-g preserve group
-v verbose

The -t, -r, -l, -p, -o, -g switches are very useful as they keep file creation and modification times, recurse into sub-directories, copy symlinks as symlinks, and preserve exact permissions, owner and group.
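
The same trick works in the opposite direction too; a minimal sketch for resuming a failed download from the remote host:

# rsync --rsh='ssh' -av --progress --partial root@UPLOAD-HOST.COM:/root/interrupted_file_name_to_be_uploaded.rar .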

Allow Directory Listing in Apache Webserver / Get around Directory index forbidden by Options directive

Thursday, October 4th, 2012

I have configured an Apache VirtualHost, and inside the hosted domain there is supposed to be a directory where Directory Listing has to be allowed. My VirtualHost configuration looks like so:


NameVirtualHost *

<VirtualHost *>
ServerAdmin my-email@domain-name.com
ServerName www.pc-freak.net
ServerAlias www.domain-name.com domain-name.com
DocumentRoot /var/www
DirectoryIndex index.html index.htm index.php index.html.var

<Directory />
Options FollowSymLinks
AllowOverride All
Order allow,deny
allow from all
</Directory>

<Directory /var/www/>
Options Indexes FollowSymLinks MultiViews
AllowOverride All
Order allow,deny
allow from all
</Directory>

ScriptAlias /cgi-bin/ /usr/lib/cgi-bin/
<Directory "/usr/lib/cgi-bin">
AllowOverride None
Options +ExecCGI -MultiViews +SymLinksIfOwnerMatch
Order allow,deny
Allow from all
</Directory>

ErrorLog ${APACHE_LOG_DIR}/error.log
LogLevel warn
CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>

I have a directory (/var/www/directory) where I store various files, and I prefer this directory to have Directory Listing enabled. The whole setup is on Debian Linux. By default in Debian, Apache is configured to disable directory listing for subdirectories on both the default host and Virtualhosts.

In order to make /var/www/directory accessible in a browser via the web address http://www.domain-name.com/directory/, I had to add the following Apache directives inside my Virtualhost /etc/apache2/sites-available/domain-name.com:



<Directory /var/www/directory>
AddDefaultCharset UTF-8
Options FollowSymLinks Indexes
AllowOverride All
</Directory>

As you can see I included also AddDefaultCharset UTF-8, because inside /directory I have files in cyrillic and, if I don’t explicitly set the encoding to UTF-8, the htmls are improperly shown in browsers.

The exact directive that enables directory listing in Apache is:


Options Indexes

Setting it to -Indexes disables directory listing, e.g.



Options -Indexes

BTW if you need to make a certain directory accessible with the default Apache Options, the permissions should be set in /etc/apache2/apache2.conf:

<Directory /var/www/>
Options Indexes
...
</Directory>

This will set the Apache directory permissions for all Virtualhosts – useful if all virtualhosts share a common ServerRoot and the directory has to be accessible via all vhosts.
Well that’s all Cheers 😉

How to list Files in a directory and generate web URLS with PHP

Tuesday, April 10th, 2012

I needed a short PHP script that reads all my .html files in a directory and then generates HTML a href links pointing to each of the .html files stored in the directory.

Here is the short code I came up with:

$directory_to_open = "my-dir/";
$max_files = 100;
$i = 0;
$thelist = '';
if ($handle = opendir($directory_to_open)) {
while (false !== ($file = readdir($handle)) && $i <= $max_files)
{
$i = $i + 1;
if ($file != "." && $file != "..")
{
// build "| <a href>name</a> |" entries, stripping the .html extension from the link text
$thelist .= '| <a href="' . $file . '">' . str_replace(".html", "", $file) . '</a> |';
}
}
closedir($handle);
echo $thelist;
}

In my case the directories with html files were planned to contain less than 100 files each, so in order to show links only to the first 100 files in the directory, I used $max_files=100 and a check against it in the while loop condition – when $max_files is reached, the while loop exits.

Because by default the file names returned are in the format file_name.html and I wanted to show only the file name without the .html extension, I used str_replace() to get rid of the extension string.
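
For newer PHP versions the same listing can be done shorter with glob() – a sketch assuming the same my-dir/ layout and .html-only contents:

<?php
// list the first 100 .html files and print links, using glob() instead of opendir()/readdir()
foreach (array_slice(glob("my-dir/*.html"), 0, 100) as $file) {
    $name = basename($file, ".html"); // strip the directory and the .html extension
    echo '| <a href="' . htmlspecialchars($file) . '">' . htmlspecialchars($name) . '</a> |';
}
?>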