
Xorg on Toshiba Satellite L40-14B with Intel GM965 video hangs after boot and the worst fix ever / How to reinstall Ubuntu while keeping the old personal data and programs

Wednesday, April 27th, 2011

black screen ubuntu troubles

I updated Ubuntu from version 9.04 (Jaunty) to 9.10, following my previous post on how to update Ubuntu from 9.04 to the latest Ubuntu.

I expected that a step-by-step upgrade from release to release would work like a charm, and though it does on many notebooks, it does not on the Toshiba Satellite L40.

The update itself went fine; I used update-manager -d and followed the tutorial pointed to above. However, after a system restart the PC failed to boot the X server properly: a completely blank screen with a blinking cursor appeared, and that was all.

I restarted the system into the 2.6.35-28-generic recovery-mode kernel in order to get to a physical console.

Logically, the first thing I did was check /var/log/messages and /var/log/Xorg.0.log, but I couldn't find anything unusual or wrong there.

I suspected something might be wrong with /etc/X11/xorg.conf, so I deleted it:

ubuntu:~# rm -f /etc/X11/xorg.conf

and attempted to re-create the xorg.conf X configuration with the command:

ubuntu:~# dpkg-reconfigure xserver-xorg

This command is reported to be the usual way to reconfigure the X server settings from the console, but in my case (for unknown reasons) it did nothing.

The next command, which was able to regenerate the xorg.conf file, was:

ubuntu:~# X -configure

The command generates a sample xorg.conf file in /root/xorg.conf.*, so I copied that file to X's default location, /etc/X11/xorg.conf, and restarted in the hope that this would fix the non-booting issue.
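
Copying the generated file into place goes roughly like this (the exact filename X -configure produces may vary; typically it is xorg.conf.new):

ubuntu:~# cp /root/xorg.conf.new /etc/X11/xorg.conf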

Very sadly, the black screen of death appeared again on the Toshiba notebook's screen.
I then thought of completely wiping out xorg.conf, in the hope that the system might at least boot without a config file, but this did not work either.

I attempted to run the X server with an xorg.conf configured to use the vesa driver. It is well known that the vesa X driver is supposed to work on 99% of video cards, as almost all of them nowadays are compatible with the VESA standard, but guess what: in my case even vesa did not work!
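
For reference, the relevant part of such an xorg.conf looks something like this (a minimal sketch; the Identifier string is arbitrary):

Section "Device"
    Identifier "Configured Video Device"
    Driver "vesa"
EndSection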

The only mode of X I could boot into was the failsafe X mode, which is available through GRUB's boot menu recovery option.

Further on, I decided to try a few xorg.conf files which I found online and which were reported to work fine with the Intel GM965 internal video, and yes, this was also unsuccessful.

Some of my other futile attempts were to reinstall the X.org server and the xserver-xorg-video-intel driver with apt-get, e.g.:

ubuntu:~# apt-get install --reinstall xserver-xorg xserver-xorg-video-intel

As nothing worked out I was completely pissed off and decided to take an alternative approach which would take a lot of time but would at least probably succeed: completely reinstall Ubuntu from a CD after backing up the /home directory and making a list of the packages installed on the system, so that I could later run a tiny bash one-liner to install all the packages which existed on the laptop before the reinstall.

Here is how I did it:

First I archived the /home directory:

ubuntu:/# tar -czvf home.tar.gz home/
....

For 12GB of data with a few thousand files, archiving took about 40 minutes.

The resulting tar archive came to about 9GB, and I used sftp to upload it to a remote server, as I did not have a flash drive or an external HDD where I could place the just-archived data.

Uploading with sftp can be achieved with a command similar to:

sftp user@yourhost.com
Password:
Connected to yourhost.com.
sftp> put home.tar.gz

As a next step, before booting from the Ubuntu Maverick 10.10 CD and proceeding with the fresh install, I backed up the list of all currently installed packages to a file with the command:

for i in $(dpkg -l | awk '/^ii/ { print $2 }'); do
echo $i; done >> my_current_ubuntu_packages.txt

Once again I used sftp, as in the example above, to upload the my_current_ubuntu_packages.txt file to my remote host.

After backing up all the necessary stuff, I restarted the system and booted from the Ubuntu CD-ROM.
The Ubuntu installation is, as usual, more than a piece of cake, and even if you don't have a brain you can succeed with it, so I won't comment on it 😉

Right after the installation I used the sftp client once again to fetch home.tar.gz and my_current_ubuntu_packages.txt.

I placed the home.tar.gz in /home/ and untarred it inside the fresh /home dir:

ubuntu:/home# tar -zxvf home.tar.gz

The old home directory ended up in /home/home, so from there I used Midnight Commander (the good old mc text-mode file explorer and manager) to restore the important user files to their respective places.
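
The same can be done from the shell; a rough sketch, assuming a hypothetical username youruser (replace it with the real one):

ubuntu:/home# cp -a /home/home/youruser/. /home/youruser/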

As a last step, I used my_current_ubuntu_packages.txt in combination with a tiny shell one-liner to install all the packages listed inside the file:

ubuntu:~# for i in $(cat my_current_ubuntu_packages.txt); do
apt-get install --yes $i; sleep 1;
done

You will have to stay in front of the computer and manually answer the ncurses-interface questions concerning some packages' configuration, and to be honest this is really annoying and time-consuming.
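
If you prefer to skip most of these prompts and simply accept the packages' default settings, exporting debconf's noninteractive frontend before running the same loop should help (a sketch; a few packages may still need to be reconfigured manually later):

ubuntu:~# export DEBIAN_FRONTEND=noninteractive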

Summing up, the overall time I spent with this stupid Toshiba Satellite L40 with the shitty Intel GM965 was 4 days, where each day I tried numerous ways to fix up X and did my best to get past the blank-screen, non-bootable X server issue without a complete reinstall of the old Ubuntu system.
This is a lesson for me: if I stumble on such shitty issues again I will proceed straight to the reinstall option and not lose my time with nonsense fixes which would never work.

I hope the article might be helpful to somebody else who experiences problems with Linux similar to mine.

After all, at least Ubuntu Maverick 10.10 is really good looking in general from a design perspective.
What really struck me was the placement of the close, minimize and maximize window buttons; it seems in newer Ubuntus the Ubuntu guys decided to place the buttons on the left. Here is a screenshot:

Left positioning of the window navigation buttons in Ubuntu 10.10

I believe the solution I explain, though very radical and slow, is a solution that would always work and hence worth it 😉
Let me hear from you if the article was helpful.

How to install GNOME server on Ubuntu 10.04.2 LTS Lucid

Wednesday, April 20th, 2011

After upgrading Ubuntu from 9.10 to Ubuntu 10.04.2, I faced problems during apt-get upgrade && apt-get dist-upgrade.

I had to fix it up with apt-get upgrade -f; however, the “fix”, which was supposed to repair the broken apt-get dist-upgrade, removed about 260 packages, among which were the GRUB boot loader, the X.org server and even GNOME.

As apt-get upgrade -f was my only possible way to fix the package mishap, I went ahead and confirmed that I would like to wipe out all those packages.

Logically, afterwards I had to install the missing X.org server and GNOME in order to make the Ubuntu desktop work again.

Here is how:

ubuntu:~# apt-get update && apt-get install xserver-xorg ubuntu-desktop

The ubuntu-desktop package is a meta-package which installs the GNOME desktop environment.
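
Since the botched dist-upgrade also removed the GRUB boot loader, it is probably wise to reinstall it before the next reboot. Something along these lines should do on a standard GRUB 2 setup booting from /dev/sda (adjust the package name and disk device to your system):

ubuntu:~# apt-get install grub-pc
ubuntu:~# grub-install /dev/sda
ubuntu:~# update-grub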

Computer and technology use, the Internet, mobile phones and all kinds of technical screen-based equipment negatively alter the human brain

Tuesday, April 26th, 2011

Computers Internet and Technology evil terminator picture

This is according to recent scientific research conducted at Stanford University in the USA.

People who actively use computers and the Internet were the subjects of the research, which was conducted in 2009.

Social networks, tablets, smartphones, etc. provide more and more possibilities for us to access information.

Most modern people today tend to lose approximately between 8 and 10 hours a day using the Internet, a PC, Word/Excel, their mobile phone or some other mobile gadget like, let’s say, an iPad.

Most of today’s technological goods that we use to make our lives easier are multitasking.
The brain itself is not adapted to working in such a multitasking mode, and as a direct consequence of being in contact with this multitasking for long periods of time it gets altered.
Suddenly it starts multitasking itself, or in other words starts processing information in parallel.

As the amount of information online is constantly increasing, we are in contact with more and more of it, and with our brains altered to work in a multitasking way, brain overflow (information overload) is becoming a more and more common event.

The consequences of this complexity are starting to impact us seriously, as we tend to get addicted to technology usage, and day by day it seems that the amount of information our brains are able to process is decreasing.

Logically enough, the long-term consequences of an Internet addiction or any kind of technology addiction, plus the tremendous amounts of information we think over daily, are starting to show their negative effects on our psyche and (soul).

The brain starts changing the way it takes in information, as it adapts itself to “not remember”; the information to be processed daily is so much that it cannot really comprehend it all.

A good example of multitasking which most, if not all, users on the Internet today engage in daily is one of the most terrible things ever created: Facebook. In one of my previous articles I’ve blogged about why social networks are a big evil (read it here), and it seems this new information about brain alteration caused by multitasking is just another supporting reason why it’s better not to use social networks like Facebook and Twitter.

The Stanford University research has shown that the endless amount of information is pernicious for our (brains) minds and is in many ways similar to an excessive amount of sugar in the body.

The scientists who conducted the research recommend that heavy computer and tech users (like me) exercise self-control and go on a tech diet (e.g. not use technology at all for at least 1 or 2 days every week).

Another serious harm shown by the Stanford scientists’ research is that the brains of people with heavy exposure to Internet or phone usage tend to have very serious problems with concentration and are very easily distracted.
In the long term this obviously leads to a chaotic way of living.
Suddenly it seems that technology is slowly becoming even more deadly and destructive than drugs.

Many people would say this kind of research is not true, but I can confirm that many of the reported findings are things I have experienced myself in my daily life, so I believe what the research found is mostly true.

This research came just a month after other scientists showed that mobile phone use leads to alterations in brain chemistry.
Apart from all the said negative consequences of technology use for the human brain, there is the problem of technology today being heavily used as a way to spy on personal privacy. I would be glad to hear in the comments section from other people like me who have problems with concentration and a very short-term memory (I myself have a serious problem with that one).