Posts Tagged ‘Cloud’

Export / Import PuTTY tunnels and SSH sessions from one Windows machine to another howto

Thursday, January 31st, 2019

[Image: PuTTY: copying SSH tunnels from one Windows machine to another]

Last November 2018 I started a new job position as Linux Architect at Itelligence AG as a contractor (external service), a great German company which hires the best IT specialists out there, offers flexible time schedules for employees, and does various very cool advanced IT operations and strategic advancement of SAP's cloud technology and services for SAP SE (SAP S/4HANA and HEC, the HANA Enterprise Cloud). As work hardware I was given a shiny Lenovo ThinkPad 500 laptop with Windows 10 OS (SAP pre-installed), and I needed to make some SSH tunnels to machines (hop stations / jump hosts). After some experimenting with MobaXterm Free (Personal Edition 11.0) and the presumable tunnel limitations of the free client, as well as my laziness to add the multiple SSH tunnels to different ssh / rdp / vnc etc. servers by hand, I finally decided to just copy all the tunnels from a colleague who runs PuTTY and to again use the good old PuTTY (the old school Winblows SSH terminal client), but just for creating the SSH tunnels, and use MobaXterm for the rest, just like in the old times while still employed at Hewlett Packard. For that reason I had to copy the tunnels from my dear German colleague Henry Beck (a good-hearted colleague who works in the field of storage, dealing with NetApp filer clusters, QNAP etc.).

Until that moment I had no idea how copying saved SSH tunnel definitions is possible. A quick research showed this is done not with the PuTTY interface itself but, instead, by dumping the PuTTY stored Windows registry records into a file, transferring that file to the PC where the tunnels need to be imported, and then loading it into the registry there (either by double-clicking the registry file or by using the Windows registry editor command line interface reg). Here is how:
 

1. Export

 

Run cmd.exe (note: the commands below require an elevated Run as Administrator prompt):

Only sessions:

regedit /e "%USERPROFILE%\Desktop\putty-sessions.reg" HKEY_CURRENT_USER\Software\SimonTatham\PuTTY\Sessions

All settings:

regedit /e "%USERPROFILE%\Desktop\putty.reg" HKEY_CURRENT_USER\Software\SimonTatham

PowerShell:

If you have PowerShell installed on the machine, dump the settings as follows.

Only sessions:

 

reg export HKCU\Software\SimonTatham\PuTTY\Sessions ([Environment]::GetFolderPath("Desktop") + "\putty-sessions.reg")

All settings:

reg export HKCU\Software\SimonTatham ([Environment]::GetFolderPath("Desktop") + "\putty.reg")
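Before exporting, you can sanity-check which saved sessions are actually present in the registry. A quick PowerShell one-liner for that (a sketch, relying on the standard HKCU: registry drive) would be:

Get-ChildItem HKCU:\Software\SimonTatham\PuTTY\Sessions | Select-Object -ExpandProperty PSChildName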


2. Import

Double-click on the *.reg file and accept the import.

 

Alternative ways:

 

cmd.exe (requires an elevated command prompt):

regedit /i putty-sessions.reg
regedit /i putty.reg

PowerShell:

reg import putty-sessions.reg
reg import putty.reg
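If you prefer a non-interactive import, e.g. when scripting the migration, regedit also has a silent switch that skips the confirmation dialog:

regedit /s putty-sessions.reg
regedit /s putty.reg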



Below are some things to consider:

Note!: Do not replace SimonTatham with your username.

 

Note!: It will create a .reg file on the Desktop of the current user (for a different location, modify the path).

 

Note!: It will not export your related SSH keys stored on the old system.

What to expect next?

[Image: PuTTY tunnels SSH sessions screenshot on Windows]

The result is that in PuTTY you will have the tunnel sessions loadable when you launch the (portable or installed) PuTTY version.
Press the Load button on the required saved tunnel from the list, and under

Connection -> SSH -> Tunnels

you will see all the copied tunnels.
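As a bonus, once the sessions are imported you don't even need the PuTTY GUI to bring a tunnel up: PuTTY's command line companion plink can load a saved session directly and only do the port forwarding (the session name below is a placeholder for one of your imported sessions):

plink -N -load "MyTunnelSession"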

Enjoy!

Preparing your Linux to work with the Cloud providers – Installing the aws, gcloud, az, oc, cf CLI cloud access command interfaces

Wednesday, October 10th, 2018

[Image: How to install cloud access tools for Google / AWS / Azure / OpenShift / Cloud Foundry on Linux]

If you're a sysadmin / developer whose boss requires a migration of stored data, database structures or web objects to Amazon Web Services / Google Cloud, or you happen to be a DevOps engineer, you will certainly need to have at minimum the Amazon AWS and Google Cloud clients installed to do daily routines and script stuff when managing cloud resources, without having to use the Web GUI interface.

Here is how to install aws, gcloud, oc, az and cf next to your Kubernetes client (kubectl) on your Linux desktop.
 

1. Install Google Cloud gcloud (to manage Google Cloud platform resources and developer workflow)
 

[Image: Google Cloud logo]

Here are a few commands to run to install the gcloud, gcloud alpha, gcloud beta, gsutil, and bq commands to manage your Google Cloud from the CLI.

a.) On Debian / Ubuntu / Mint or any other deb based distro

# Create environment variable for correct distribution
export CLOUD_SDK_REPO="cloud-sdk-$(lsb_release -c -s)"

# Add the Cloud SDK distribution URI as a package source
echo "deb http://packages.cloud.google.com/apt $CLOUD_SDK_REPO main" | sudo tee -a /etc/apt/sources.list.d/google-cloud-sdk.list

# Import the Google Cloud Platform public key
curl https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add -

# Update the package list and install the Cloud SDK
sudo apt-get update && sudo apt-get install google-cloud-sdk


b) On CentOS, RHEL, Fedora Linux and other rpm based ones
 

$ sudo tee -a /etc/yum.repos.d/google-cloud-sdk.repo << EOM
[google-cloud-sdk]
name=Google Cloud SDK
baseurl=https://packages.cloud.google.com/yum/repos/cloud-sdk-el7-x86_64
enabled=1
gpgcheck=1
repo_gpgcheck=1
gpgkey=https://packages.cloud.google.com/yum/doc/yum-key.gpg
       https://packages.cloud.google.com/yum/doc/rpm-package-key.gpg
EOM

# yum install google-cloud-sdk

 

That's all. Now gcloud, the command line client that talks to Google Cloud's API, is installed under
/usr/bin/gcloud
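Before first use you need to authenticate and pick a default project; a quick first-run sanity check could look like this:

$ gcloud init                    # interactive: sign in and choose a default project
$ gcloud auth list               # show the authenticated account(s)
$ gcloud compute instances list  # e.g. list the VM instances in the default project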

Latest install instructions of Google Cloud SDK are here.


2. Install AWS Cloud command line interface tool for managing AWS (Amazon Web Services)
 

[Image: Amazon Web Services logo]

The AWS client depends on Python PIP, so before you proceed you will have to install the python-pip deb package; on Debian / Ubuntu Linux use apt:

 

# apt-get install --yes python-pip

 

It is also possible to install the newest version of PIP via get-pip.py, a tiny Python script served from PyPA's bootstrap site:

 

# curl -O https://bootstrap.pypa.io/get-pip.py
# python get-pip.py --user

 

# pip install awscli --upgrade --user
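Once installed, the aws command needs credentials before it can do anything useful; a minimal first-run check might be:

$ aws configure   # prompts for Access Key ID, Secret Access Key, default region and output format
$ aws s3 ls       # e.g. list your S3 buckets to verify the credentials work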

 

3. Install Azure Cloud Console access CLI command interface
 

[Image: Microsoft Azure cloud logo]

On Debian / Ubuntu or any other deb based distro:

$ AZ_REPO=$(lsb_release -cs)
$ echo "deb [arch=amd64] https://packages.microsoft.com/repos/azure-cli/ $AZ_REPO main" | \
  sudo tee /etc/apt/sources.list.d/azure-cli.list

$ curl -L https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
$ sudo apt-get update
$ sudo apt-get install apt-transport-https azure-cli

 

Finally, to check that the Azure CLI is properly installed, run a simple login with:

 

$ az login

 


On CentOS / RHEL / Fedora Linux the same is done with:

$ sudo rpm --import https://packages.microsoft.com/keys/microsoft.asc
$ sudo sh -c 'echo -e "[azure-cli]\nname=Azure CLI\nbaseurl=https://packages.microsoft.com/yumrepos/azure-cli\nenabled=1\ngpgcheck=1\ngpgkey=https://packages.microsoft.com/keys/microsoft.asc" > /etc/yum.repos.d/azure-cli.repo'
$ sudo yum install azure-cli

$ az login
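After logging in, a couple of read-only commands can confirm the CLI really talks to your subscription:

$ az account show                # show the active subscription
$ az group list --output table   # e.g. list resource groups in table format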


For the latest install instructions check Microsoft's documentation here

4. Install OpenShift OC CLI tool to access OpenShift Open Source Cloud

 

[Image: OpenShift, Red Hat's cloud platform]

Even though OpenShift has its original Red Hat produced package binaries, if you're not on an RPM distro it is probably
best to install using the official latest version from the openshift GitHub repo.


As of the time of writing this article, this is done with:

 

# wget https://github.com/openshift/origin/releases/download/v1.5.1/openshift-origin-client-tools-v1.5.1-7b451fc-linux-64bit.tar.gz
# tar -xvf openshift-origin-client-tools-v1.5.1-7b451fc-linux-64bit.tar.gz

 

# mv openshift-origin-client-tools-v1.5.1-7b451fc-linux-64bit oc-tool

 

# cd oc-tool
# echo 'export PATH=$HOME/oc-tool:$PATH' >> ~/.bashrc

 

To test openshift, try to log in to the OpenShift cloud:

 

$ oc login
Server [https://localhost:8443]: https://128.XX.XX.XX:8443
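Once logged in, a few basic read-only commands are handy for looking around:

$ oc projects   # list the projects you have access to
$ oc status     # overview of the current project
$ oc get pods   # pods running in the current project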


Latest install instructions on OC here

5. Install Cloud Foundry cf CLI Cloud access tool

[Image: Cloud Foundry logo]

a) On Debian / Ubuntu Linux based distributions, do run:

 

$ wget -q -O - https://packages.cloudfoundry.org/debian/cli.cloudfoundry.org.key | sudo apt-key add -
$ echo "deb https://packages.cloudfoundry.org/debian stable main" | sudo tee /etc/apt/sources.list.d/cloudfoundry-cli.list
$ sudo apt-get update
$ sudo apt-get install cf-cli

 

b) On Red Hat Enterprise Linux / CentOS and Fedora

 

$ sudo wget -O /etc/yum.repos.d/cloudfoundry-cli.repo https://packages.cloudfoundry.org/fedora/cloudfoundry-cli.repo
$ sudo yum install cf-cli
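To verify the client, point it at the API endpoint of your Cloud Foundry installation (the URL below is a placeholder) and push something:

$ cf login -a https://api.example.org   # authenticate against your CF API endpoint (placeholder URL)
$ cf apps                               # list the deployed applications
$ cf push my-app                        # deploy the app in the current directory (hypothetical app name)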


For the latest install instructions on cf cli check Cloud Foundry's install site

There are plenty of other cloud providers, with their number growing exponentially, and most have their own custom CLI access tools, but as their use is not as common as the five mentioned above, I've omitted them. If you're interested to know the complete list of companies providing cloud services, check here.

6. Install the RHC Ruby gems tools collection

If you have to work with Red Hat Cloud Storage / OpenShift, you will perhaps also want to install the RHC (Red Hat Cloud) tools collection.

Assuming that the Linux system is running an up-to-date version of the Ruby programming language, do run:

 

 

root@jeremiah:~# gem install rhc
Fetching: net-ssh-5.0.2.gem (100%)
Successfully installed net-ssh-5.0.2
Fetching: net-ssh-gateway-2.0.0.gem (100%)
Successfully installed net-ssh-gateway-2.0.0
Fetching: net-ssh-multi-1.2.1.gem (100%)
Successfully installed net-ssh-multi-1.2.1
Fetching: minitar-0.7.gem (100%)
The `minitar` executable is no longer bundled with `minitar`. If you are
expecting this executable, make sure you also install `minitar-cli`.
Successfully installed minitar-0.7
Fetching: hashie-3.6.0.gem (100%)
Successfully installed hashie-3.6.0
Fetching: powerbar-1.0.18.gem (100%)
Successfully installed powerbar-1.0.18
Fetching: minitar-cli-0.7.gem (100%)
Successfully installed minitar-cli-0.7
Fetching: archive-tar-minitar-0.6.1.gem (100%)
'archive-tar-minitar' has been deprecated; just install 'minitar'.
Successfully installed archive-tar-minitar-0.6.1
Fetching: highline-1.6.21.gem (100%)
Successfully installed highline-1.6.21
Fetching: commander-4.2.1.gem (100%)
Successfully installed commander-4.2.1
Fetching: httpclient-2.6.0.1.gem (100%)
Successfully installed httpclient-2.6.0.1
Fetching: open4-1.3.4.gem (100%)
Successfully installed open4-1.3.4
Fetching: rhc-1.38.7.gem (100%)
===========================================================================

 

If this is your first time installing the RHC tools, please run 'rhc setup'

===========================================================================
Successfully installed rhc-1.38.7
Parsing documentation for net-ssh-5.0.2
Installing ri documentation for net-ssh-5.0.2
Parsing documentation for net-ssh-gateway-2.0.0
Installing ri documentation for net-ssh-gateway-2.0.0
Parsing documentation for net-ssh-multi-1.2.1
Installing ri documentation for net-ssh-multi-1.2.1
Parsing documentation for minitar-0.7
Installing ri documentation for minitar-0.7
Parsing documentation for hashie-3.6.0
Installing ri documentation for hashie-3.6.0
Parsing documentation for powerbar-1.0.18
Installing ri documentation for powerbar-1.0.18
Parsing documentation for minitar-cli-0.7
Installing ri documentation for minitar-cli-0.7
Parsing documentation for archive-tar-minitar-0.6.1
Installing ri documentation for archive-tar-minitar-0.6.1
Parsing documentation for highline-1.6.21
Installing ri documentation for highline-1.6.21
Parsing documentation for commander-4.2.1
Installing ri documentation for commander-4.2.1
Parsing documentation for httpclient-2.6.0.1
Installing ri documentation for httpclient-2.6.0.1
Parsing documentation for open4-1.3.4
Installing ri documentation for open4-1.3.4
Parsing documentation for rhc-1.38.7
Installing ri documentation for rhc-1.38.7
Done installing documentation for net-ssh, net-ssh-gateway, net-ssh-multi, minitar, hashie, powerbar, minitar-cli, archive-tar-minitar, highline, commander, httpclient, open4, rhc after 10 seconds
13 gems installed
root@jeremiah:~#

To start with rhc, next do:
 

rhc setup
rhc app create my-app diy-0.1


and play with it to install software and create services on the Red Hat cloud.
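A few everyday rhc commands, using the my-app application created above:

$ rhc apps          # list your applications and their status
$ rhc tail my-app   # stream the application's logs
$ rhc ssh my-app    # SSH into the application's gear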

 

 

Closure

These are just a few of the numerous tools available, and I definitely understand there is much more to be said on the topic.
If you can remember other tools or interesting cloud starting-up tips about stuff to do on a freshly installed Linux PC to make life easier for a Cloud / PaaS / SaaS / DevOps engineer, please drop a comment.

Cloud Computing a possible threat to users privacy and system administrator employment

Monday, March 28th, 2011

[Image: Cloud computing screenshot]

If you’re employed into an IT branch an IT hobbyist or a tech, geek you should have certainly heard about the latest trend in Internet and Networking technologies the so called Cloud Computing

Most of the articles available in newspapers and online have seriously praised cloud computing and put their hopes in a better future through it.
But is cloud computing really as good as promised? I seriously doubt it.
Let's think about it: what is a cloud? It's a cluster of computers which are connected to work as one.
No person can precisely say where exactly on the cluster cloud a piece of stored information is located (not even the administrator!)

The data stored on the cluster is the property of a few single organizations, let's say Microsoft, Amazon etc., so we as users no longer have physical possession of our data (in case we use the cloud).

On the other hand, the number of system administrators needed to administer a huge cluster is dramatically decreased; the everyday system administrator, who needs to check a few webservers and a mail server on a daily basis, cache web data with a Squid proxy cache or just restart a server, will no longer be necessary.

Therefore a few million people would have to lose their jobs; the people necessary to administer a cluster will probably be no more than a few thousand, as the clouds are so huge that no more than a few clouds will exist on the net.

The idea behind the cluster is that we the users store and retrieve our desktops and boot our operating system from the cluster.
Even loading a simple webpage will have to retrieve its data from the cluster.

Therefore it looks like in the future cloud computing and the internet are about to become one and the same thing. The internet might become a single super cluster where all users would connect with their user IDs and have full access to the information inside.

Technologies like OpenID are trying to make user identification uniform; I assume a similar uniform user identification will be used in the future in a super cloud, where everybody entering will have access to his/her data and will have the option to access any other data online.

The desire of humans and business for transparency would probably end up, one day, in people wanting to share every single bit of information.
Even though it looks very cool for a sci-fi movie, it's seriously scary!

Cloud computing expenses, as they're really high, would be affordable only for multi-national corporations like Google and Microsoft.

Therefore small and middle-sized IT businesses (network building and expanding, network and server system integration etc.) would gradually collapse and die.

These are only a few tiny concerns, but in reality the problems that cloud computing might create are way more severe.
We the people should think seriously and try to oppose cloud computing while we still can! It might even be a good idea if special legislation aiming at limiting cloud computing could be introduced, so that it is used only inside the boundary of prescribed limitations.

Institutions like the European Parliament should be more concerned about the issues which the use of cloud computing will bring; EU legislation should be voted on very soon, and binding contracts should stop clouds from expanding and taking over the middle-sized IT business.

Speed up your DNS resolving if your Internet Service Provider's DNS servers fail or resolve slowly / Privacy concerns of public DNS services use

Wednesday, March 30th, 2011

In my experience with many network Internet Service Providers so far, I've encountered a lot of DNS oddities and, as a result, web surfing and mail slowness.

It’s sometimes very irritating especially in cases, when I use my internet over Wireless public or university wireless networks.
In principle many of the Wireless routers which distribute the internet especially in organizations are badly configured and the slowness with DNS resolvings is an absolute classic.
If you haven’t encountered that slowness in opening web pages when connected from your University’s canteen, whether it’s fill with people for the lunch break, then I should say you’re really lucky!

My personal experience with the DNS services of these badly configured devices has been quite negative, and every now and then I set up and use public DNS servers like OpenDNS and Google DNS.

Very often, when I connect to a wireless network with my notebook running Debian Linux and the internet is too slow in opening pages, I automatically set the Google or OpenDNS servers as the default DNS IP resolving servers.

1. The DNS IP addresses of Google Public DNS are:

8.8.8.8
and
8.8.4.4

2. The OpenDNS public DNS servers have the IP addresses:

208.67.222.222
208.67.220.220

I set up and use the above public DNS services' addresses via the commands below.

3. Set and use Google Public DNS services on my Linux

debian:~# cp -rpf /etc/resolv.conf /etc/resolv.conf.orig
debian:~# echo -e "nameserver 8.8.8.8\nnameserver 8.8.4.4" > /etc/resolv.conf

I first create a backup of my resolv.conf under the name resolv.conf.orig, just to make sure I can revert back to my old DNSes if I need them at some point.

If you prefer to use the OpenDNS services for, let's say, privacy reasons, you do it in the same manner as with the above commands, only changing the IP addresses.

4. Configure and use the OpenDNS public DNS services

debian:~# cp -rpf /etc/resolv.conf /etc/resolv.conf.orig
debian:~# echo -e "nameserver 208.67.222.222\nnameserver 208.67.220.220" > /etc/resolv.conf
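To check which resolver is actually faster from your location, you could compare query times with dig (assuming the dnsutils / bind-utils package is installed):

debian:~# dig @8.8.8.8 google.com | grep 'Query time'
debian:~# dig @208.67.222.222 google.com | grep 'Query time'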

Of course using public DNS services has its disadvantages alongside the domain resolving speed-up advantage.
One major issue is that public DNS services run on top of a cloud, and if you have read my previous article Cloud Computing a possible threat to users privacy and system administrator employment you might be against the idea of using services which are powered by a cloud.

The other primary concern is related to your SECURITY and PRIVACY: by using public DNS networks, you risk that your public DNS provider might use DNS spoofing techniques to mislead you and resolve common domain names, which usually resolve to, let's say, 1.1.1.1, to, let's say, 1.5.5.10.

Even though this kind of practice on the side of a public DNS provider is not a likely scenario, the possible implications of public DNS providers using DNS forgery to fool you about domain name locations are a very serious issue.

As public DNS providers again embody the good old philosophy of cloud computing and strive to become some kind of standard which people might vote to adopt and use, the future implications of a wide adoption of public DNS servers might be a terrible thing for internet users' privacy!!!

Just think about a future scenario where we users of the Internet are forced to use a number of public DNS servers in order to use the Internet!
Usually very large companies own the public DNS services and pay for the tech equipment required to build up the cluster clouds which provide them; therefore, if in the near future public DNS becomes a fashion and (God forbid!) a standard which displaces the regular ISP DNS servers resolving domains to IPs, then it will be terrible.

The corporations which own the public DNS service(s) might have direct control over filtering and censoring information posted on any website on the internet.
Even worse, if the world decides to adopt public DNS services somewhere in the future, this means that the large corporations owning the public DNS cluster or clusters will be able to see each and every resolving request made by any user on the net.
If you think closely, such information possessed by a company is not the best thing we want.

So let me close up this article: I'm not a fan or an evangelist who preaches the use of public DNS services. On the contrary, I honestly hate the idea behind public DNS.
Nevertheless, apart from my personal opinion, I'm a practical person, and using the public DNS servers every now and then, when this will accelerate my access to the internet, is still an option I do enjoy.

Maybe it’s time for a free software project (a tor like), which will provide users with an OpenDNS alternative which will run on hobbyist computers around the globe (just like with tor).

What’s rather funny is that the loud name OpenDNS is a big lie in reality OpenDNS is not opened it’s a company owned closed source service 😉

Cloud Computing and Linode alternatives to normal dedicated machines

Wednesday, September 30th, 2009

One of the dedicated servers' hard drives that I maintain has failed.
This alone has created many, many problems! It's serious stuff,
since this server is the main mail server and primary DNS.
I'm still unsure what is going to happen. The services of Valueweb, the dedicated
server provider responsible for the server, completely suck! The situation became
even more complex with the fact that Valueweb has been bought by hostway.com.
The old server whose hard drive has failed needs replacement and is creating
serious problems within the company; however, the management decision
required for giving a go to the replacement operation was postponed
every time I tried to raise the question of the need for a server replacement.
Well, today the worst has happened. I was woken up by my GSM's beeping just
to realize there were 20 SMSes notifying me that the server was inaccessible.
That happened around 5:30 in the morning! What a miserable life, I had only
slept for a couple of hours since I went to bed at 2 o'clock. I tried multiple
times to request reboots of the server, none of which brought the system
back. That wasn't fun, really. Right after that I raised a technical support request asking them to check the server's hardware. I never received any response from Valueweb to my
mail informing me about the status of my request, as usual!
Damned Valueweb, these stupid bastards, ARGHH! At 9:30 our project manager rang to ask what was wrong with the mails.
Well, what might it be: a hardware crash.
She had to call America, where the colocation resides, and ask them to move on with the hardware test, otherwise Valueweb wouldn't move a finger.
Of course I was right, the system hard drive was broken and required replacement. After a few calls from the office and my quick suggestion to change all
clients' SMTP and POP3 server settings to point to the backup server, which thank God at least worked, they did so (thank God).
I took the quick decision to come to the office in Varna, since I had to talk with the tech support
in Florida and help some of the clients to correctly set up the new host in their Thunderbirds
and Outlooks. Yes, here you can see how they use me as tech support as well.
Many websites are down currently; I have spoken a couple of times with Florida's tech support, who were quite arrogant to me.
Never ever use Valueweb or Hostway as a dedicated server provider; seriously, these guys are bastards!
My idea was to replace the old server with a new one with better hardware since it's already broken; however the company CEO was not keen on the idea, again trying to save some money!
You can understand what kind of a "good" CEO he is, since he tries to cut costs on the hardware, that way guaranteeing future problems again.
He suggested to me to try to migrate to Amazon EC2 (Cloud Computing).
Something impossible, since we have specific stuff running on the server.
I had to research and assure myself that this Cloud Computing is bulky shit, just like I thought; well, it's good for stuff like Facebook, however in our situation it's impossible. You can't simply migrate all the stuff to cloud computing by waving a magic wand.
In the meantime, while researching the topic, I found something interesting: check out Linode here.
As a matter of fact I think it's a better solution for a plain Linux install. Just to complete the post I have to say cloud computing is interesting, but it doesn't stand on the same ground as a normal Linux install.
A brief funny introduction to Cloud Computing can be seen in YouTube's 'cloud computing explained'. What Wikipedia says about Cloud Computing is a nice beginner's introduction as well.
Cloud Computing is the latest hype out there; there is certainly a future in Cloud Computing, even though I don't see it as ready for the masses yet.
I have to note that it seems the Linux-powered cloud computing options run a Linux Xen kernel. I haven't tested Xen recently, but the last time a colleague of mine and I did (2 years ago),
it seemed back then that it would take about 5 more years to become 100% compliant with the normal Linux kernels.
However, the matter is quite interesting and I have to investigate it further soon.
Now we have decided to solve the situation by purchasing a newer dedicated server. The stupid asses from Hostway need 24 hours to bring the machine up; you can imagine how insane that is.
In the best case all the stuff on the old server would be up and running in the coming 48 hours or even more… ghah