Posts Tagged ‘processing’

Using perl and sed to substitute strings in multiple files on Linux and BSD

Friday, August 26th, 2011

On many occasions when administering a Linux, BSD, SunOS or any other *nix system, there is a need to substitute strings inside a file, or a group of files containing a certain string, with another one.

The task is not too complex, and most of the senior sysadmins out there will certainly have faced this requirement already and will probably have a good idea of how to do file substitution with perl and sed. However, I’m quite sure there are dozens of system administrators out there who do not know how, or who simply haven’t yet faced a situation where strings have to be substituted from a command shell or via a scripting language.

This article targets exactly those system administrators who are not 100% sys op Gurus 😉

1. Substitute text strings inside files on Linux and BSD with perl

The Perl programming language was originally created for heavy text manipulation, and most Linux / Unix based hosts today have a working copy of perl installed; therefore, using perl as a means to substitute one string in a file with another is maybe the best way to complete the task.
Another good thing about perl is that text processing with it is said to be, in most cases, a bit faster than with sed .
It still depends on the string being substituted, and I haven’t done benchmark tests to say with 100% certainty that perl is always quicker, but my common sense suggests perl will be.
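For the curious, a quick-and-dirty comparison is easy to run with the time command against any sufficiently large file (bigfile below is just a placeholder name, and the outcome will vary with the pattern and the data):

debian:~# time perl -pe 's/foo/bar/g' bigfile > /dev/null
debian:~# time sed 's/foo/bar/g' bigfile > /dev/null

Whichever of the two finishes faster on your own data is the one to pick for big jobs.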

Now, enough talk; a very simple way to substitute a recurring text string inside a file with another chosen one is like so:

debian:~# perl -pi -e 's/foo/bar/g' file1 file2

This will substitute the string foo with bar everywhere it’s matched in file1 and file2 .

However, the above command is a bit “dangerous”, as no backup copy of the original files is preserved before the string is substituted.
Therefore the above command should only be used when one is 100% sure about the string changes to be made.

Hence, a better idea when conducting the text substitution is to also keep a backup of the original file under, let’s say, a .bak extension. To achieve that I use perl as follows:

freebsd# perl -i.bak -p -e 's/syzdarma/magdanoz/g;' file1 file2

This command creates copies of the original files file1 and file2 under the names file1.bak and file2.bak ; in file1 and file2 every text occurrence of the string syzdarma will get substituted with magdanoz , thanks to the /g option, which means (substitute globally).
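If the substitution turns out to be wrong, the originals are easy to roll back from the .bak copies with a tiny bash loop (a sketch, assuming no unrelated .bak files are sitting in the same directory):

freebsd# for f in *.bak; do mv "$f" "${f%.bak}"; done

The ${f%.bak} expansion simply strips the .bak suffix, so each backup overwrites its modified counterpart.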

2. Substitute string in all files inside directory using perl on Linux and BSD

Every now and then there is a need to do manipulations on large numbers of files. I can’t remember right now a concrete scenario where I had to change all occurrences of a matching string to another one in all files located inside a directory, but anyhow, I’ve done this on a number of occasions.

A good way to do a mass file string substitution on Linux and BSD hosts equipped with a bash shell is via the commands:

debian:/root/textfiles:# for i in *.txt; do perl -i.bak -p -e 's/old_string/new_string/g;' "$i"; done

Here the text files have the default .txt file extension.

The above bash loop walks over each of the .txt files located in /root/textfiles and substitutes everywhere (globally) old_string with new_string , keeping a .bak backup of each file as before.

Another alternative to the above example, to replace a text string occurring in many files across multiple directories, is a combination of the shell commands grep, perl, sort, uniq and xargs .
Let’s say one wants to match a custom string everywhere inside the root directory and all its descendant directories and substitute it with another one; this can be done with the cmd:

debian:~# grep -R --files-with-matches 'old_string' / | sort | uniq | xargs perl -pi~ -e 's/old_string/new_string/g'

This command will look up the string old_string in all files under the / (root) directory and, wherever it occurs, substitute it with new_string (this command’s idea was borrowed from http://linuxadmin.org so thx.).

Using a combination of 5 commands, however, is not very wise in terms of efficiency.

Therefore, to save some system resources, it is better in terms of efficiency to take advantage of the find command in combination with xargs ; here is how:

debian:~# find / -type f | xargs grep -sl 'old_string' | uniq | xargs perl -pi~ -e 's/old_string/new_string/g'

Once again the find command example will do exactly the same as the substitution method with grep -R ; the -s flag silences grep’s error messages and -l makes it print only the names of the matching files.

As enough has been said about substituting text strings inside files using perl, I will further explain how text strings can be substituted using sed .

The main reason why sed could be a better choice in some cases is that not every Unix is equipped by default with a perl interpreter. In general, the number of servers that have sed installed is surely higher than the number with a perl language interpreter.

3. Substitute text strings inside files on Linux and BSD with sed stream editor

On many occasions, where a website is hosted, one needs to quickly change a string inside all files located in a directory, e.g. to resolve issues with static URLs directly encoded in the HTML.
To achieve this task, here is some code using two little bash loops in conjunction with the sed and mv commands:

debian:/var/www/website# for i in *; do sed -e "s#index.htm#http://www.webdomain.com/#g" "$i" > "$i.new"; done
debian:/var/www/website# for i in *.new; do mv "$i" "${i%.new}"; done

The above sed -e “s#index.htm#http://www.webdomain.com/#g” instructs sed to substitute every appearance of the text string index.htm with the new text string http://www.webdomain.com/ . Note that # is used as the substitution delimiter here, so the slashes inside the URL don’t need to be escaped.

The first for loop creates substituted copies of all the files as file1.new, file2.new, file3.new etc.
The second for loop uses mv to overwrite the original input files file1, file2, file3, etc. with the newly created file1.new, file2.new, file3.new; the ${i%.new} expansion strips the .new suffix to recover each original file name.

There is a way shorter way to accomplish the same text substitution task using a simpler one-liner with only sed and bash’s filename globbing; here is how:

debian:/var/www/website# sed -i 's/old_string/new_string/g' *

The above command will change old_string to new_string inside all files in the directory /var/www/website .

Where a change has to be made to a modest number of files, this method might be the most efficient; however, where a text substitution has to be done in, let’s say, 5000+ files, the above simplistic version will not work. An Argument list too long error will prevent sed -i ‘s/old_string/new_string/g’ from completing its task. (The actual limit is on the total size of the argument list the kernel accepts, not on a fixed file count.)
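A workaround in that case is to feed the file list to sed in batches via find and xargs, so no single invocation ever exceeds the kernel’s argument limit (a minimal sketch, assuming GNU sed and that only the regular files directly inside the directory should be touched):

debian:/var/www/website# find . -maxdepth 1 -type f -print0 | xargs -0 sed -i 's/old_string/new_string/g'

xargs splits the file list into as many sed invocations as necessary, and the -print0 / -0 pair keeps filenames with spaces or other odd characters intact.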

The earlier two-liner of for loops should also work without problems on FreeBSD and the rest of the BSD derivatives, though I have not tested it yet, hence any feedback from FreeBSD guys is most welcome.

Consider that in order for the for loop commands to work on FreeBSD or NetBSD, they have to be run under a bash shell.
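One BSD-specific caveat worth noting: FreeBSD’s sed requires an argument to -i (the backup suffix), so the in-place one-liner from point 3 has to be given an explicit, possibly empty, suffix there:

freebsd# sed -i '' 's/old_string/new_string/g' *

Writing the suffix attached, as in sed -i.bak , works on both GNU and BSD sed and keeps backups as a bonus.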
That’s all folks; thanks the Lord for letting me write this nice article. I hope it gives some insight into how multi-file text replacement on Unix works.
Cheers 😉

How to check Host is up with Nagios for servers with disabled ICMP (ping) protocol

Friday, July 15th, 2011

At the company where I administer some servers, we’re running Nagios to keep track of the servers’ status and instantly report if problems with connectivity to certain servers occur.

Now, one of the servers which has host UP checks configured is actually up, but because of heavy ICMP denial of service attacks against the servers, the ICMP (ping) protocol is completely disabled on it.

In Nagios this host was constantly showing as DOWN in the usual red color, so Nagios reported an issue even though all services on the client were running fine.

As this is quite annoying, I checked whether Nagios supports host checking without doing the ICMP ping test. It appears it does, through something called in Nagios Submit passive check result for this host .

Enabling the “Submit passive check result for this host” option can be done straight from Nagios’s web interface (so I don’t even have to edit configurations! 😉).
Here is how I did it. In Nagios I had to navigate to:

Hosts -> click on my host (hosting1), which showed in red as down

Nagios disable ICMP ping report for hosts

You can see the down host I clicked on showing in red in the above pic.

On the next Nagios screen I had to select Disable active checks of this host

Nagios Disable active ICMP checks of this host
and press the Commit button.

Next, the following text appears in the browser:

Your command request was successfully submitted to Nagios for processing.

Note: It may take a while before the command is actually processed.

Afterwards I had to click on Submit passive check result for this host and in the
Check Output field type in:

check_tcp -p 80

Here is the Screenshot of the Command Options dialog:

Nagios submit passive check with check TCP -p 80

That’s all; now Nagios should start checking the down host by querying whether the webserver on port 80 is up and running instead of pinging it.
The server is also no longer shown in Nagios’s Down host list.
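To double-check that the check behaves as expected, the check_tcp plugin can also be run by hand from a shell on the Nagios server (the plugin path below is the usual Debian location and hosting1 stands for my monitored host; substitute your own):

debian:~# /usr/lib/nagios/plugins/check_tcp -H hosting1 -p 80

If it reports the TCP port as OK, the passive result submitted through the web interface will match what is really happening on the host.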

Computers, Technology use, Internet, Mobile Phones and all kinds of technical screen-based equipment negatively alter the human brain

Tuesday, April 26th, 2011

Computers Internet and Technology evil terminator picture

This is according to recent scientific research conducted at Stanford University, USA.

People who actively use computers and the internet were the subjects of the research, carried out in 2009.

Social networks, tablets, smartphones etc. provide more and more possibilities for us to access information.

Most modern people today tend to lose approximately 8 to 10 hours a day using the Internet, a PC, Word / Excel, their mobile phone or some other kind of mobile gadget, like, let’s say, an iPad.

Most of today’s technological goods we use to make our lives easier are multitasking.
The brain itself is not adjusted to work in such a multi-tasking mode; as a direct consequence of being in contact with this multi-tasking for long periods of time, it gets altered.
Suddenly it starts multitasking itself, or in other words, starts processing information in parallel.

As the amount of information online constantly increases, as we come in contact with more and more of it, and moreover as our altered brains start working in multi-tasking mode, brain overflow (information overload) is becoming a more and more frequent event.

The consequence of this complexity is starting to impact us seriously, as we tend to get addicted to technology usage, and day by day it seems the amount of information our brains are able to process is decreasing.

Logically enough, the long-term consequences of an internet addiction, or any kind of technology addiction, plus the tremendous amount of information we think over daily, are starting to show their negative effects on our psyche (and soul).

The brain starts changing the way it takes in information as it adapts itself to “not remember”, since the information to be processed daily is so much that it cannot really comprehend it.

A good example of multi-tasking which most, if not all, users on the Internet engage in daily is one of the most terrible things ever created: Facebook. In one of my previous articles I’ve blogged about why social networks are a big evil (read it here), and it seems this new information about brain alteration caused by multi-tasking is just another supporting reason why it’s better not to use social networks like Facebook and Twitter.

The Stanford University research has shown that the endless amount of information is pernicious for our minds (brains) and is in many ways similar to an excessive amount of sugar in the body.

The scientists who conducted the research recommend that heavy computer and tech users (like me) exercise self-control and keep a tech diet (e.g. abstain from technology completely for at least 1 or 2 days every week).

Another serious harm shown by the Stanford scientists’ research was that people whose brains have severe exposure to internet or phone usage tend to have very serious problems with concentration and are very easily distracted.
In the long term this obviously leads to a chaotic way of living.
Suddenly it seems technology is slowly becoming even more deadly and destructive than drugs.

Many people would say this kind of research is not true, but I can confirm that many of the reported findings are things I have experienced myself in my daily life, so I believe what the research has shown is mostly true.

This research came just a month after other scientists showed that mobile phone use leads to alteration of the brain chemistry.
Apart from all the said negative consequences of technology use for the human brain, there is also the problem of technology today being heavily used as a way to spy on personal privacy. I would be glad to hear in the comments section from other people who, like me, have problems with concentration and have a very short-term memory (I myself have serious problems with that one).