Posts Tagged ‘option’

How to add a visitors counter to a Joomla based website using the VINAORA Joomla plugin

Wednesday, December 29th, 2010

One of the Joomla based websites that I'm managing needed an option to show a Visitors Counter at the bottom of its pages.

I did some research online to check for Joomla plugins capable of equipping my Joomla installation with the Visitor Counter feature.

There are a couple of plugins available out there that can provide a Joomla installation with a web counter; however, I'll only mention the few which I have tested myself:

1. VCNT
2. Count your Visitors
3. Cool Hits Counter
and finally the one that I really liked the most:

4. VINAORA

The first three, VCNT, Count your Visitors and Cool Hits Counter, actually work right after installation. I haven't had the time to test them thoroughly, but at first glance they appear to be counting visitors.

The Count your Visitors plugin's download page was in German with no option to switch to English, so that wasn't nice.

Still, the plugin seems to work fine once downloaded, installed and configured from:

Extensions -> Module Manager

The configuration options for Count your Visitors are quite obscure, but if you're looking for a really simple way to count your Joomla based website visitors it might be a good choice. The plugin is based on the initial Joomla module called Statistics.

By the way, I've read some reports online that Joomla 1.5 also includes a minimal integrated web counter of its own, through the Joomla Statistics module.
However, I personally couldn't follow the methods described to take advantage of the integrated Statistics module, so if somebody has already succeeded in using the default Joomla Statistics module, I'll be glad if he shares it with me.

I found the VCNT visitor statistics plugin a handy one, but this module had one major problem: the VCNT 1.5 text which appeared as a heading above the statistics was hard to wipe off my web page, so eventually I got pissed off and went on to test the Cool Hits Counter.

The Cool Hits Counter is actually a simplistic module which presents the web site's visitor count as numeric digits.
This module uses the integrated Joomla 1.5 module mod_stats; the only difference is the support for numeric digit display.

As I've said, none of the above modules was flexible enough and therefore none was what I was looking for, thus I decided to use the VINAORA Joomla web (visitors) counter plugin I had installed.

The VINAORA plugin is actually quite straightforward to configure; right after installation, to start the configuration I had to navigate to:

Extension -> Module Manager -> Vinaora Visitors Counter -> Module Parameters

Since I wanted just a simple counter without any external statistics, I personally preferred using Vinaora's counter with the option Display Mode: Simple, as well as Zero-Statistics turned on (Yes).

Some of the other options I found to best match what I wanted from the Visitors Counter were:

Show Title: No and Position: Left

Now I had these shiny visitor statistics in my Joomla installation, but there was an annoying link appearing in my Visitors Counter on the page. In order to remove the default Visitors Counter link, which was pointing to VINAORA's web page, I had to edit the file:
modules/mod_vvisit_counter/tmpl/default.php located in my Joomla document root directory.
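A quick way to locate the lines that need editing (the exact line numbers mentioned below refer to the plugin version I had installed and will likely differ in other releases) is to grep the template for the Vinaora string:

# grep -in vinaora modules/mod_vvisit_counter/tmpl/default.php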

First I edited the code on line 21, where I removed the Vinaora string, since I didn't want any reference to Vinaora to appear in my Joomla HTML code.

Then I removed line 162, the line of code starting with:

$html .= ….

By removing it I completely scraped out the annoying references to VINAORA and their link to http://vinaora.com, and thus made the Visitors Counter look a bit more professional.

Now the counter is working with graphical numeric digits and everything is just fine with my Joomla web counter, thanks to VINAORA 🙂
 

Fix Null error in WordPress comment reply with wordpress-threaded-comments plugin enabled

Friday, April 6th, 2012

I've been running WordPress for 3 years or so now, quite a long time. I can hardly remember the first WordPress install, but it was something like WordPress 2.5 or 2.4.

For quite a long time my WordPress blog has been powered by a number of plugins, which I regularly update whenever new versions pop up …
Most of the time I haven't noticed problems during major WordPress platform updates or updates of the installed extensions. However, today when I tried to reply to one of my blog comments, I was shocked to find that I couldn't.
Pointing at the Comment Reply box and typing inside it was impossible, and a null message stayed filled in the form:

To catch what was causing this weird misbehaviour of the reply comments functionality, I grepped through my /var/www/blog/wp-content/plugins/* for the movecfm(null,0,1,null) string:

# cd /var/www/blog/wp-content/plugins
# grep -rli 'movecfm(null,0,1,null)' */*.php
wordpress-thread-comment/wp-thread-comment.php

I had taken the string movecfm(null,0,1,null) from the page source in Firefox (by pressing Ctrl+U).

Once I knew the cause of the problem, I first tried commenting out the occurrences of the null fields in wp-thread-comment.php, but since there were other troubles with commenting these out and I was too lazy to read the whole code, I checked online whether other fellows had experienced the same shitty null void JavaScript error and whether someone had already pointed out a solution. In a few minutes of searching I was unable to find anyone who reported this bug, but what I did find were some user threads on wordpress.org mentioning that since WordPress 2.7+ the wordpress-threaded-comments plugin is obsolete, and the functionality provided by the plugin is already available by default in newer WP installs.

Hence, in order to enable WordPress's (embedded) threaded comments reply functionality from within the wp-admin panel, I used:

Settings -> Discussions -> Enable Threaded (nested) comments (Tick)

(screenshot: enabling nested comments from the default WordPress Discussion settings)

You can see there is also an option to define how many levels of nested sub-comments can be placed per comment; the default was 5, but I thought 5 was a bit low, so I increased it to 10 reply levels per comment.

Finally, to prevent the default threaded comments from interfering with the WordPress Threaded Comments plugin, I disabled the plugin through the menus:

Plugins -> Active -> WordPress Thread Comments (Deactivate)

This solved the weird JavaScript null "bug" caused by wordpress-threaded-comments once and for all.
Hopefully from now on, my blog readers will not have issues with threaded comment replies.

How to delete millions of files on busy Linux servers (working around "Argument list too long")

Tuesday, March 20th, 2012

How to delete a million or many thousands of files in the same directory on GNU / Linux and FreeBSD

If you try to delete more than 131072 files on Linux with rm -f *, where the files are all stored in the same directory, you will get an error:

/bin/rm: Argument list too long.
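The error comes from the kernel's limit on the total size of the argument list passed to a single command (the shell expands * into every file name before rm is even started). If you are curious what the limit is on your system, getconf can show it; the exact value differs between kernels:

# getconf ARG_MAX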

I've blogged earlier on deleting multiple files on Linux and FreeBSD, and this is not my first time facing this error.
Anyways, as time passed, I've found a few other new ways to delete large multitudes of files from a server.

In this article, I will shortly explain a few approaches to deleting a few million obsolete files to clean up some space on your server.
Here are a few methods you can use to clean up your tons of junk files.

1. Using Linux find command to wipe out millions of files

a.) Finding and deleting files using find's -exec switch:

# find . -type f -exec rm -fv {} \;

This method works fine, but it has one downside: file deletion is quite slow, as for each found file an external rm command is invoked.

For half a million files or more, using this method will take "long". However, from the point of view of stressing the server's hard disk it is not so bad, as the file deletion does not put too much strain on the disk.
b.) Finding and deleting a big number of files with find's -delete argument:

Luckily, there is a better way to delete the files, by using find's built-in -delete argument:

# find . -type f -delete
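By the way, if only stale files should go (say, anything older than a month), find can also filter by modification time before deleting; the 30-day threshold below is just an example value:

# find . -type f -mtime +30 -delete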

c.) Deleting and printing out deleted files with find's -print arg

If you would like to output to your terminal what files find is deleting in "real time", add -print:

# find . -type f -print -delete

To prevent your server hard disk from being stressed, and hence save yourself from "outages" of the server's normal operation, it is good to combine the find command with ionice, e.g.:

# ionice -c 3 find . -type f -print -delete

Just note that ionice cannot guarantee that find's operations will not severely affect the hard disk I/O requests. On heavily busy servers with high amounts of disk I/O writes, applying ionice will still not prevent the server from hanging! Be sure to always keep an eye on the server while deleting the files, no matter whether with or without ionice. If during find's execution the server gets lagged in serving its ordinary client requests, stop the execution of the command immediately by killing it from another ssh session or tty (if physically on the server).
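A simple way to keep that eye on the server from a second ssh session while the deletion runs could be something like the following (the 5-second interval is just an example; iostat comes from the sysstat package):

# watch -n 5 uptime
# iostat -dx 5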

2. Using a simple bash loop with rm command to delete "tons" of files

An alternative way is to use a bash loop which iterates over each of the files in the directory and issues /bin/rm on each of the loop elements (files), like so:

for i in *; do
    rm -f "$i";
done

If you'd like to print what is being deleted, add an echo to the loop:

# for i in *; do \
echo "Deleting : $i"; rm -f "$i"; \
done
The bash loop worked like a charm in my case, so I really warmly recommend this method whenever you need to delete more than 500 000+ files in a directory.

3. Deleting multiple files with perl

Deleting multiple files with perl is not a bad idea at all.
Here is a perl one liner, to delete all files contained within a directory:

# perl -e 'for(<*>){((stat)[9]<(unlink))}'

If you prefer to use a more human readable perl script to delete a multitude of files, use delete_multple_files_in_dir_perl.pl

Using the perl interpreter to delete thousands of files is quick, really, really quick.
I did not benchmark on the server exactly how quick it is, but I guess the deletion rate should be similar to the find command. It's possible that in some cases the perl loop is even quicker …

4. Using a PHP script to delete multiple files

Using a short PHP script to delete the files one by one in a loop, similar to the above bash script, is another option.
To do the deletion with PHP, use this little PHP script:

<?php
// Directory whose files will be deleted
$dir = "/path/to/dir/with/files";
$dh = opendir($dir);
$i = 0;
while (($file = readdir($dh)) !== false) {
    $file = "$dir/$file";
    // Skip subdirectories and the . / .. entries, delete plain files only
    if (is_file($file)) {
        unlink($file);
        // Report progress on every 1000th removed file
        if (!(++$i % 1000)) {
            echo "$i files removed\n";
        }
    }
}
closedir($dh);
?>

As you can see, the script reads the directory defined in $dir and loops through it, deleting the files one by one.
You should already know PHP is slow, so this method is only useful if you have to delete many thousands of files on a shared hosting server with no (ssh) shell access.

This PHP script is taken from Steve Kamerman's blog. I would also like to express my big gratitude to Steve for writing such a wonderful post. His post actually became the inspiration for this article.

You can also download the PHP delete-millions-of-files sample script here

To use it, rename delete_millioon_of_files_in_a_dir.php.txt to delete_millioon_of_files_in_a_dir.php and run it through a browser.

Note that you might need to run it multiple times, because many shared hosting servers are configured to kill a PHP script which keeps running for too long.
Alternatively, the script can be run through the shell with the PHP CLI:

php delete_millioon_of_files_in_a_dir.php

5. So what is the "best" way to delete a million files on Linux?

In order to find out which method is quicker in terms of execution time, I did a home brew benchmark on my ThinkPad notebook.

a) Creating 509072 sample files.

Again, I used a bash loop to create many thousands of files in order to benchmark.
I didn't want to put this load on a production server, hence I used my own notebook to conduct the benchmarks. As my notebook is not a server the benchmarks might be partially incorrect, however I believe they're still a pretty good indicator of which deletion method would be better.

hipo@noah:~$ mkdir /tmp/test
hipo@noah:~$ cd /tmp/test;
hipo@noah:/tmp/test$ for i in $(seq 1 509072); do echo aaaa >> $i.txt; done

I had to wait a few minutes until I had the 509072 files at hand. As you can read, each of the files contains the sample "aaaa" string.

b) Calculating the number of files in the directory

Once the command completed, to make sure all 509072 files were present, I used a find + wc command (prefixed with time) to count the number of files contained in the directory:

hipo@noah:/tmp/test$ time find . -maxdepth 1 -type f | wc -l
509072

real 0m1.886s
user 0m0.440s
sys 0m1.332s

It's interesting that using an ls command to count the files is less efficient than using find:

hipo@noah:/tmp/test$ time ls -1 |wc -l
509072

real 0m3.355s
user 0m2.696s
sys 0m0.528s

c) Benchmarking the different file deletion methods with time

– Testing delete speed of find

hipo@noah:/tmp/test$ time find . -maxdepth 1 -type f -delete
real 15m40.853s
user 0m0.908s
sys 0m22.357s

You see, using find to delete the files is neither too slow nor lightning quick.

– How fast is the perl loop at deleting a multitude of files?

hipo@noah:/tmp/test$ time perl -e 'for(<*>){((stat)[9]<(unlink))}'

real 6m24.669s
user 0m2.980s
sys 0m22.673s

Deleting my 509072 sample files took 6 minutes and 24 seconds. This is roughly two and a half times faster than find! GO-GO perl 🙂
As you can see from the results, perl is a great and time saving way to delete 500 000 files.

– The approximate deletion speed of the for + rm bash loop

hipo@noah:/tmp/test$ time for i in *; do rm -f $i; done

real 206m15.081s
user 2m38.954s
sys 195m38.182s

You see the execution took 206 minutes and 15 seconds of real time, i.e. almost 3 HOURS and a HALF!!!! This is extremely slow! But it works like a charm, as the running deletion didn't impact my normal laptop browsing. While the script was running I was mostly browsing through a few not so heavy (non flash) websites and doing some other stuff in gnome-terminal 🙂

As you can imagine, running a bash loop is a bit CPU intensive, but it puts less stress on the hard disk read/write operations. Therefore it's clearly a good practice to use it whenever deletion of many files is required on a dedicated server.

d) My production server file deleting experience

On a production server I only tested two of all the listed methods to delete my files. The production server where I tested is running Debian GNU / Linux Squeeze 6.0.3. There I had the task to delete a few million files.
The methods tried on the server were:

– The find . -type f -delete method.

– for i in *; do rm -f $i; done

The results from using the find -delete method were quite sad, as the server almost hanged under the heavy hard disk load the command produced.

With the for loop script all went smoothly. The files were being deleted for a long, long time (a few hours), but while it was running, the server continued operating with no interruptions.

While the bash loop was running, the server load average stayed at a steady 4.
Taking my experience into account, if you're running a production server and you're still wondering which delete method to use to wipe out some multitude of files, I would recommend you go the bash for loop + /bin/rm way. Yes, it is extremely slow, expect it to run for half an hour or more, but it doesn't put too much extra load on the server.

Using the PHP script will probably be slow and inefficient compared to both find and the bash loop. I didn't give it a try yet, but I suppose it will be either equal in time or at least a few times slower than bash.

If you have tried the PHP script and have some observations, please drop a comment to tell me how it performs.

To sum it up:

Even though there are "hacks" to clean up a messy directory full of a few million junk files, such a directory should never have existed in the first place.

Frankly, keeping millions of files within the same directory is a very stupid idea.
Doing so will have a severe negative impact on the directory listing performance of your filesystem in the long term.

If you know better (more efficient) ways to delete a multitude of files in a directory, please share them in the comments.

Improve default picture viewing on Slackware Linux with XFCE as Desktop environment

Saturday, March 17th, 2012

The default XFCE picture viewer on Slackware Linux is GIMP (GNU Image Manipulation Program). Though GIMP is great for picture editing, it is rather strange why Patrick Volkerding set up XFCE to use GIMP as the default picture viewer. The downsides of GIMP being the default picture viewing program for Slackware's XFCE are the same as with Xubuntu's XFCE Ristretto: you can't easily switch pictures back and forward with the keyboard keys (left, right arrow keys, backspace or space etc.). Besides that, other disadvantages of using GIMP are:
a) picture opening time in GIMP is significantly higher compared to a simple picture viewer program like GNOME's default, Eye of GNOME (eog).

b) GIMP is more CPU intensive and puts a high load on the machine at each picture opening.

A default Slackware install comes with two good picture viewing programs that can substitute for GIMP:
 

  • Gwenview

    (screenshot: Gwenview on Slackware Linux, XFCE)

  • Geeqie

    (screenshot: Geeqie on Slackware Linux, XFCE)

    Both of the programs support picture switching, so if you open a picture you can switch to the other ones in the same directory as the first opened one.
    I personally liked Gwenview more, because it has more intuitive picture switching controls. With it you can switch pictures with the space and backspace keyboard keys.

    To change GIMP's default opening of PNG and JPEG files, I right-clicked over a picture and changed the Open With: program in the file properties.

    (screenshot: XFCE4 picture file properties window on Slackware Linux)
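    If you prefer doing this from a terminal, the xdg-mime tool (part of xdg-utils) can set the default viewer per MIME type. Note that the gwenview.desktop file name below is an assumption and may be named differently on your install:

    $ xdg-mime default gwenview.desktop image/jpeg image/png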

    If you're curious about the picture in all the screenshots, this is the Church of Saint George (situated in the city center of Dobrich, Bulgaria).
    St. Georgi / St. George Church was built in 1842 and is the oldest Orthodox Church in Dobrich.
    During the Crimean War (1853-1856) the church was burned down, and it was restored to its present form in 1864.

    gpicview is another cool picture viewing program I like. Unfortunately, on Slackware there is no prebuilt package, and the only option is either to convert it with alien from a deb package, or to download the source and compile as usual with ./configure && make && make install .
    Downloading and compiling from source went just fine on Slackware Linux 13.37. gpicview has a more modern looking interface than gwenview and geeqie, and is great for people who want to keep pace with desktop fashion 🙂

Yesterday, Today, Tomorrow

Sunday, December 2nd, 2007

Yesterday I spent a lot of time outside with Lily. We went to the fountain and we watched a film at home. The film was called “Wild Hogs”; it was supposed to be a fun comedy (only supposed to be). This week is going to be a tough one. We have to present a project in Marketing Research.

I have to write a 600-word resume about International Enterprise; also we have to make a presentation in Culture. Today in the morning I was at a Liturgy again. God's grace is here! The week passed without serious server issues (Thanks God). Today I checked some logs of one of the servers and I observed oddities there. I checked the crontab and I realized it's because of a crontab job. The dumped database is a HUGE one: 2.6G (bzipped).

I asked in irc.freenode.net #mysql, and the guys there pointed me to a similar issue which was supposed to be a MySQL bug when dumping large databases. Since the dumped databases were of type MyISAM, I of course could have used mysqlhotcopy.

But in the end the solution to the problem was removing the "--opt" option from the backup options of mysqldump and passing "--skip-opt" to it (I suspect this slows down the dumping process a lot). But I don't care; a slow dump is much better than hanging the whole webserver and interrupting the site's visibility over the Internet.
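For reference, a bzipped mysqldump using --skip-opt looks roughly like the line below; the database name, user and backup path here are made-up placeholders for illustration:

# mysqldump --skip-opt -u backupuser -p mydatabase | bzip2 > /backup/mydatabase.sql.bz2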

Btw, I started playing Quake 2; it's cool but a little annoying: there are too many tunnels, and very often after I kill most of the bad guys I spend a lot of time searching for keys and stuff .. :)

Here I’m :)

Saturday, August 16th, 2008

Haven't blogged for some time, so I'm going to write down a few lines. Today I passed the Cisco CCNA2 Final Exam and the CCNA2 Voucher exam successfully, thanks to God :). The Voucher exam grants everybody who passes it a discount on the price of the CCNA Certification exam, which is pretty cool. To be honest, I cheated on the 2 tests, otherwise I wouldn't have passed, but I'm really busy these days and a bit tired, so this was the only option. I also did my CCNA2 (9, 10, 11) tests, for which I used the cisco answers page :). The CCNA2 Final Exam and the Voucher weren't included in the blogspot cisco answers, so I googled around and thankfully found the answers on a website on the net.

These days I had a lot of fun and saw a lot of old friends (some of them studying, others working in Sofia). It's the summer holiday period, in which tons of friends who live in other cities are back in Dobrich, so I was able to spend a nice time with a lot of them. These are people like: hellpain, nomen, mariana (one of my ex-girlfriends).

On Wednesday Nomen and I went to the beach at Balchik; we had great fun there. We met in front of the place we call "The Young House" at 7:30, at 8:30 we were already travelling to Balchik, and at 9:35 we were on the beach. We stayed there until 12:30. Vlado and Mitko tried to teach me to relax on my own back while in the water; I almost did it, but I still need some more tries until I'm able to do it the right way. At 13:00 we were back in Dobrich and we went to an open restaurant, "Seasons". I ate a vegetarian pizza and Vlado ate a meal called "English Breakfast". Right after that, at 14:00, I went to the regional Red Cross building, because I had to take a course in "First Aid"; it's required to have passed that course before you're able to apply for a driving license, after you complete both the theoretical exam and the practice exam. My driving practice exam will be on Wednesday at 9:00; to be honest I'm still not driving well and my driving course is almost over. My only hope for passing is in the Lord; only if He supports me and guides me during the exam will I be able to pass.

The same night I went out with Shanar, and later we met hellpain and Alex. We had nice discussions in the city park until somewhere around 11:30. When we were going back home I met Mariana and I proposed we have a walk together; she accepted and we agreed to meet around 30 minutes later. We spent 4 hours talking about stuff and drinking beer. I should say she is one of the only girls I'm able to speak with for hours and still feel good and comfortable. The center of our talks was mainly God, the Bible and in particular my belief in Christ Jesus as Lord and Saviour.

On Friday I had driving courses early in the morning from 8:00 o'clock; later I went to Varna with my father, because I had to pick up a bag I forgot the last time I was in Varna. Thanks God we didn't crash. On our way to Varna there was a very dangerous situation in which the chances of crashing were pretty big, but thankfully, due to God's protection and kindness, we didn't. Later, when we came back to Dobrich, I went to the police office to check if my international passport was ready. Thankfully it was, and now I have the "red passport" at home and ready :). I called Mariana to greet her, because it was the great Christian feast; we at the Orthodox Church believe that on that day the earthly Mother of our Lord Jesus resurrected on the 3rd day and ascended to heaven just like our Lord!
We call that celebration "The Maryam's day". Everybody who is named after Mary's name also celebrates this great feast. We decided to meet at night time at 11:30 and have a walk. Like I said earlier, I really enjoy Mariana's company. Unfortunately, an hour later we met Bino, so we weren't able to talk much about stuff with Mariana. Bino is a pretty cool guy, but sometimes his company is pretty annoying :). Later I went home and after a minute of prayer I went to bed.

On August 22 I'm traveling to the Netherlands to continue my studies in the HRQM stream; sometimes I feel a sort of preliminary homesickness, but I believe this decision is right and it's God's sovereign plan for my life. Well, as Bugs Bunny (my favourite cartoon character) says, that's all folks!

The day wasn’t so bad at all

Monday, February 5th, 2007

Yesterday I was at the birthday of a girl, Krisi; she is 18 already. It was a standard teen party. An awful place; I drank 3 beers and smoked a lot of cigarettes, and in general we discussed with some people "Does the Lord exist" kind of things. Is evolution real, is it possible at all, things like this. There was one boy who was keen on Free Software, GPL etc. I realized I'm a real psycho after I started to convince him Windows and M$ products are much better than free software :]. After 3 hours speaking like a 15-year m$ user, when walking back home, I asked myself: what the heck? What did I do, I'm a real psycho ;]. Zuio and I broke 1 flowerpot while doing POGO to a Hipodil song. I ate some strange salad with my hands (I pretended to be a Bangla person) :]. Nomen drank a lot, got angry at a guy and proposed to start a fight :]. At 5:30 a.m. we were already at home. Our mood got fucked up a little when going back home. In the morning Papi woke me up and suggested going for a coffee; I agreed as usual, but I misunderstood the place where we decided to meet and waited at the wrong spot; after that I realized I had made a mistake and he meant we would see each other at the Winter Theatre, not the Summer One (silly me). After that normal day, I ate, did some server descriptions and played MAME at Nomen's (I went to his home). In the end I was able to get gxmame to work in fullscreen (my integrated video card causes a lot of problems with games), after I found out about the xmame -ef 1 option. That one rules.

Alternative way to kill X in Linux with Alt + Printscreen + K

Sunday, October 31st, 2010

I’ve recently realized that the CTRL + ALT + BACKSPACE keyboard combination is no longer working in Debian unstable.

This good old, well known keyboard combination to restart X is not working with my xorg 7.5+8 under my GNOME 2.30 desktop.
However, thankfully there is another combination to kill the X server, for instance if your GNOME desktop hangs.

If that happens, simply press ALT + PRINTSCREEN + K; this will kill your X and then reload gdm (the GNOME Display Manager).
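As far as I know this key combination is handled by the kernel's Magic SysRq facility (the PrintScreen key doubles as SysRq), so it only works if SysRq is enabled in the kernel. A quick way to check is to read the sysctl value, where 0 means the facility is disabled:

cat /proc/sys/kernel/sysrq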

Another suggestion I've read in the forums for a way to re-enable CTRL+ALT+BACKSPACE is to put in either .bashrc or .xinitrc the following command:

setxkbmap -option terminate:ctrl_alt_bksp

BTW, it's better if the above command is placed in ~/.xinitrc.
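A minimal way to append it there from the shell (assuming you want it in ~/.xinitrc) would be something like:

echo 'setxkbmap -option terminate:ctrl_alt_bksp' >> ~/.xinitrc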

I've also read on some forums that in newer releases of Ubuntu, CTRL+ALT+BACKSPACE can be re-enabled using a specific command, e.g. with:

dontzap --disable
 

How to list enabled VirtualHosts in Apache on GNU / Linux and FreeBSD

Thursday, December 8th, 2011

(picture: how Apache processes VirtualHost requests; how to list Apache VirtualHosts on Linux and FreeBSD)

I decided to start this post with this picture I found in an onlamp.com article called “Simplify Your Life with Apache VirtualHosts”. I put it here because I think it illustrates quite well Apache webserver's internal processes. The picture also gives a good clue of when Virtual Hosts get loaded. Anyway, I'll go back to the main topic of this article, hoping the above picture gives some more insight into how Apache works.
Here is how to list all the enabled virtualhosts in Apache on Debian GNU / Linux serving pages:

server:~# /usr/sbin/apache2ctl -S
VirtualHost configuration:
wildcard NameVirtualHosts and _default_ servers:
*:* is a NameVirtualHost
default server exampleserver1.com (/etc/apache2/sites-enabled/000-default:2)
port * namevhost exampleserver2.com (/etc/apache2/sites-enabled/000-default
port * namevhost exampleserver3.com (/etc/apache2/sites-enabled/exampleserver3.com:1)
port * namevhost exampleserver4.com (/etc/apache2/sites-enabled/exampleserver4.com:1)
...
Syntax OK

The line *:* is a NameVirtualHost means that Apache's VirtualHost handling will be able to serve VirtualHosts listening on any IP address (configured on the host), on any port configured for the respective VirtualHost to listen on.

The next output line:
port * namevhost exampleserver2.com (/etc/apache2/sites-enabled/000-default shows that requests to the domain on any port (port *) will be accepted by the webserver, and the :2 suffix in /etc/apache2/sites-enabled/000-default:2 indicates that the <VirtualHost> is defined on line 2 of that file.

To see the same list of all enabled VirtualHosts on FreeBSD, the command to issue is:

freebsd# /usr/local/sbin/httpd -S
VirtualHost configuration:
wildcard NameVirtualHosts and _default_ servers:
*:80 is a NameVirtualHost
default server www.pc-freak.net (/usr/local/etc/apache2/httpd.conf:1218)
port 80 namevhost www.pc-freak.net (/usr/local/etc/apache2/httpd.conf:1218)
port 80 namevhost pcfreak.afraid.org (/usr/local/etc/apache2/httpd.conf:1353)
...
Syntax OK

On Fedora and the other RedHat-based Linux distributions, apachectl -S (or httpd -S) should display the enabled VirtualHosts.

One might wonder what the reason could be for someone to want to check which VirtualHosts are loaded by the Apache server, since this could also be checked by reviewing Apache / Apache2's config files. Well, the main advantage is that checking directly in the files might sometimes take more time, especially if a file contains thousands of similarly named virtual host domains. Another case where the -S option is better is when some enabled VirtualHost in a config file seems not to be accessible; checking directly whether Apache has properly loaded the VirtualHost directives ensures there is no problem with loading the VirtualHost itself. Another scenario is if there are multiple Apache config files / installs located on the system and you're unsure which one to check for the exact list of Virtual domains loaded.
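A quick trick in that last situation is to filter the -S output for the domain you're interested in; example.com below is, of course, just a placeholder for the real domain:

server:~# /usr/sbin/apache2ctl -S 2>&1 | grep -i example.com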