Solved: How to log in to an AWS EC2 Linux Instance

In this post we will discuss how you can log in to your AWS EC2 Linux instance using PuTTY.
Pre-requisites:-
  • PuTTY and PuTTYgen downloaded on your machine.
  • The private key (.pem file) that you saved while launching your EC2 instance.
Once you are done with the pre-requisites, let’s move ahead.
Convert .pem key to .ppk
  • First we will convert the .pem key to a .ppk key.
  • Open the PuTTYgen you downloaded.
  • Click on “Load”. Browse and select your private key with the .pem extension.
Now click on “Save private key”.
It will ask if you want to add a passphrase, which acts like an additional password when you log in. If you want, you can enter one in the “Key passphrase” field.
For this exercise I just clicked “Yes” to save the key without a passphrase.

  • Save the key with a name you like. Check that the new key file now has a .ppk extension.
Using the Key for Login
Now we will use the .ppk key we just created to log in to our EC2 instance.
  • Open PuTTY, which we downloaded earlier.
  • In the left pane click on “Session”.
In “Host Name” enter your server details, i.e. the user name and IP.
If you are using an Amazon Linux image, the default user is ec2-user. So the entry will look like ec2-user@33.44.55.66 with Port 22.

  • In the left navigation pane click on “Connection” and expand it.
Next expand “SSH” and click on “Auth” (refer to the image below).

In the right pane click on “Browse” and select the .ppk key we created earlier.
  • Now in the left navigation pane click on “Session” again. In the right pane, under “Saved Sessions”, name the session “test” or whatever you like and click “Save”. This will save your session so that you don’t have to do this activity again.
  • Finally, select the session you created and click “Open”. If all is configured correctly you will now be logged in to your EC2 instance.
Note:- If your SSH session gets timed out after being idle for a few minutes, check this post on how to set the PuTTY keep-alive time.
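If you prefer to script the login instead of clicking through PuTTY, here is a minimal sketch using the Python paramiko library. It connects with the original .pem key (paramiko does not need the .ppk conversion); the key path and IP below are placeholder values.
```python
import paramiko

KEY_PATH = "my-ec2-key.pem"   # placeholder: the .pem key you downloaded from AWS
HOST = "33.44.55.66"          # placeholder: your instance's public IP
USER = "ec2-user"             # default user on Amazon Linux images

client = paramiko.SSHClient()
# Automatically trust the host key on first connection (fine for a quick test).
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, port=22, username=USER, key_filename=KEY_PATH)

# Run a test command to confirm the login works.
stdin, stdout, stderr = client.exec_command("uname -a")
print(stdout.read().decode())

client.close()
```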

Solved: Using FileZilla for transferring files to AWS EC2 instances

FileZilla is a great free, open-source tool for securely transferring files to and from Unix and Linux servers. It is also far more secure than plain FTP to a Linux server.
If not already downloaded, you can get FileZilla from here.
By default you can enter the user ID and password of the destination server to connect to it on port 22.
But in the case of AWS EC2 instances you don’t get a password; instead you use the SSH key to connect.
We will need the .ppk key for this activity. If you have a .pem key and not a .ppk key, refer to this post to convert the .pem key to a .ppk key.
Once you have the .ppk key, let’s move on.
Create a new site in FileZilla
  • Click on File > Site Manager
  • In the left pane click on “New Site” and give a name to the site e.g. devserver
  • In the right window enter the details as:-
Host:- The IP of your server
Protocol:- SFTP
Logon type:- Key file
User:- ec2-user (or the user name for which you uploaded the SSH key)
Key file:- Browse and select the .ppk key file
  • Finally click OK.
Try connecting to the server. It should work now.
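If you would rather script the transfer than use the FileZilla GUI, here is a minimal SFTP sketch with the Python paramiko library. It uses the original .pem key directly (the .ppk format is only needed by PuTTY/FileZilla); the file names and IP are placeholders.
```python
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("33.44.55.66", username="ec2-user", key_filename="my-ec2-key.pem")

# Open an SFTP session over the same SSH connection.
sftp = client.open_sftp()
sftp.put("report.txt", "/home/ec2-user/report.txt")   # upload a local file
sftp.get("/home/ec2-user/app.log", "app.log")         # download a remote file

sftp.close()
client.close()
```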

If you are not allowed to install third-party software, please check this post to use SFTP for file transfer to EC2.

Solved: "Network error: Software caused connection abort"

Sometimes you may have noticed that your PuTTY session gets disconnected with the error “Network error: Software caused connection abort”.
This can happen because of a timeout setting on the server, or sometimes due to a firewall. To resolve this issue you will have to set a keep-alive time for the session.
After you set a keep-alive time, PuTTY will send a packet every specified number of seconds to keep the session alive.
Generally you can set it to 240 seconds, i.e. 4 minutes. But at times you may have to keep it lower; for example, when I connect from my home laptop to my AWS EC2 instance I have to keep it at 2 seconds.
To set it:
  • Open PuTTY.
  • Load the session for which you are facing the timeout issue.
  • Click on “Connection” in the left pane.
Here we have set “Seconds between keepalives” to 2 (refer to the image below).



  • Finally, click on “Session” in the left pane and save the session.
If you already have other saved sessions in PuTTY, you will have to repeat the above steps for each saved session as needed.
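The same idea applies if you connect from a script instead of PuTTY. Below is a minimal sketch using the Python paramiko library, which exposes an equivalent keep-alive setting on its transport; the key path and IP are placeholders.
```python
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("33.44.55.66", username="ec2-user", key_filename="my-ec2-key.pem")

# Equivalent of PuTTY's "Seconds between keepalives":
# send a keep-alive packet every 240 seconds (use a smaller value if your
# network still drops idle sessions).
client.get_transport().set_keepalive(240)

# ... long-running work over the connection goes here ...
client.close()
```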

Comparing AWS, Azure and Red Hat exams

In the last couple of years I’ve taken certification exams from multiple cloud providers. While the AWS and Azure exams are more theoretical and based on multiple-choice questions, the Red Hat OpenStack exam is a practical lab exam. In this article I’ll be discussing the pros and cons of the different exam patterns.
AWS
Passed AWS CSA – Associate and Professional
Pros:-
  • You can refer to any question any time you want during the exam.
  • The exam tests you on a wide range of topics.
  • Even if you have made mistakes in the beginning, you can recover by reviewing the questions later.
  • No datasheet-type questions like “How much RAM does a c3.xlarge offer?”
  • You get a 1-year free tier, which is great for learning about AWS.
Cons:-
  • Exams are expensive in comparison to Microsoft Azure (at least in India).
  • Many questions just test your reading speed.
  • No versioning, so you don’t know if you should answer as per a recent announcement or the older method.
Azure
Passed Architecting Microsoft Azure Solutions.
Pros:-
  • Azure exams are not very expensive in comparison to AWS or Red Hat.
  • The exam tests you on a wide range of topics.
  • Even if you have made mistakes in the beginning, you can recover by answering correctly in later sections.
Cons:-
  • No versioning, so you don’t know if you should answer as per a recent announcement or the older method.
  • It’s a race against time.
  • The most insane thing I found in the exam is that with each case study you get 5 to 6 questions, but for about 2 to 3 of those questions you can’t refer back to the case study. I don’t understand why Microsoft expects you to remember a whole two-page case study.
  • A 30-day free tier is too little to really get to know Azure.
Update:- Azure is now (Oct-17) offering a 12-month free trial.
Red Hat OpenStack
Passed RHCSA in Red Hat OpenStack
Pros:-
  • The difficulty level of the exam is medium.
  • You generally get questions on tasks which you will be doing in real life.
  • You get only 15-20 questions (tasks), and even if someone knows those questions beforehand, they will still have to do the tasks practically to pass the exam. So even if anyone has dumps, they are useless.
  • It has versioning, so you know whether you have to answer as per Red Hat OpenStack version 6 or 8.
  • You can play with Red Hat OpenStack by installing it on your desktop or laptop. Good learning!
Cons:-
  • Exams are expensive in comparison to Microsoft Azure.
  • If you make a mistake at the beginning or in the middle of the exam, chances are you will mess up the whole exam or waste a lot of time correcting it.
  • If your machine doesn’t work properly you may lose time, but generally the examiners take care of this.
In the end, I’d like to say that professional exams should not be like college entrance exams, which mostly test your reading speed and cramming ability. That’s OK for entrance exams, since undergrads and grads have limited practical experience.
Professional exams should be more practical, so you can be sure that if a person has cleared the exam, they definitely know how to do the work in real life.
If you want to know how to prepare for these exams, refer to my posts for AWS, Azure and Red Hat OpenStack.

Azure Crash Course - WebJobs

Azure WebJobs is similar to AWS Lambda, a serverless technology.
You can run programs or scripts as WebJobs in your Azure App Service web app in three ways:
On demand – You trigger it whenever you need it.
Continuously – It keeps running in the background.
Schedule – You can schedule it to run at a specific date and time.
The following file types are accepted:-
  • .cmd, .bat, .exe (using Windows cmd)
  • .ps1 (using PowerShell)
  • .sh (using Bash)
  • .php (using PHP)
  • .py (using Python)
  • .js (using Node.js)
  • .jar (using Java)
Interestingly, Azure WebJobs supports shell scripts, which AWS Lambda does not.
The WebJobs SDK does not yet support .NET Core.
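As a rough illustration, an on-demand (triggered) WebJob can be a single script. The sketch below assumes the usual WebJobs convention of a file named run.py in the job folder, with anything written to stdout ending up in the WebJob log; the clean-up task mentioned in the comments is hypothetical.
```python
# run.py - minimal sketch of an on-demand (triggered) WebJob.
# Output printed to stdout shows up in the WebJob's log in the portal.
import datetime

print("WebJob started at", datetime.datetime.utcnow().isoformat())

# Hypothetical task: this is where the real work (e.g. cleaning up old
# log files, processing a queue message) would go.
print("Doing the actual work...")

print("WebJob finished")
```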
Typical use cases for Azure WebJobs:-
  • Image processing or other CPU-intensive work.
  • Queue processing.
  • RSS aggregation.
  • File maintenance, such as aggregating or cleaning up log files.
  • Other long-running tasks that you want to run in a background thread, such as sending emails.
  • Any tasks that you want to run on a schedule, such as performing a back-up operation every night.
Do Note
  • Web apps in Free mode can time out after 20 minutes if there are no requests to the scm (deployment) site and the web app’s portal is not open in Azure. Requests to the actual site will not reset this.
  • Code for a continuous job needs to be written to run in an endless loop (see the sketch after this list).
  • Continuous jobs run only while the web app is up. Check what a Web App is.
  • Basic and Standard modes offer the Always On feature which, when enabled, prevents web apps from becoming idle.
  • You can only debug continuously running WebJobs. Debugging scheduled or on-demand WebJobs is not supported.
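To make the endless-loop note above concrete, here is a minimal sketch of what a continuous WebJob script might look like in Python; the heartbeat work and the 60-second interval are placeholder choices.
```python
# run.py - sketch of a continuous WebJob: the script itself loops forever,
# because Azure keeps it running but does not re-invoke it on a schedule.
import datetime
import time

while True:
    # Hypothetical work item: poll a queue, aggregate an RSS feed, etc.
    print("heartbeat at", datetime.datetime.utcnow().isoformat())
    time.sleep(60)  # sleep so we don't spin in a tight loop
```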
This article was written to give you a quick snapshot of WebJobs. You can follow this Azure doc to check how to quickly deploy apps with WebJobs.
The Azure Crash Course series is created to give you a quick snapshot of Azure services. You can check out other services in this series over here.