Steps to create Docker containers on your laptop


In this post we will see how to create a Docker container of Ubuntu on your Windows laptop.

Pre-requisites:-

I am using the “Docker for Windows” software to run Docker containers on my Windows 10 laptop. You can get “Docker for Windows” by clicking on this link.
If you have Windows 7, download Docker Toolbox for Windows, which comes with VirtualBox.

Ubuntu Docker container creation
Once you are done with the Docker installation, let's move ahead.
  • In the Windows command prompt or in the “Docker Quickstart Terminal”, execute the command below. By default it will pull the latest Ubuntu image available in the repository.
C:\CloudVedas>docker run ubuntu
  • If you need a specific version of Ubuntu, you can mention the version tag in the command. For example, below we are pulling Ubuntu 14.04. You can check all the available versions here.
C:\CloudVedas> docker run ubuntu:14.04
Unable to find image 'ubuntu:14.04' locally
14.04: Pulling from library/ubuntu
2e6e20c8e2e6: Pulling fs layer
30bb187ac3fc: Pulling fs layer
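As a side note, if you only want to download an image without starting a container, the standard docker pull command fetches it the same way:

C:\CloudVedas>docker pull ubuntu:14.04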


  • Now let’s see the image we have downloaded .
C:\CloudVedas> docker image ls
REPOSITORY          TAG                 IMAGE ID            CREATED             SIZE
ubuntu              14.04               6e4f1fe62ff1        4 months ago        197MB


  • Let’s create a container with that image using the image id. Here we are using -t and -d option so that the container keeps on running in detached mode and we can login to it.
C:\CloudVedas>docker run -t -d 6e4f1fe62ff1
a8dee68d78026adb830edb04391af00ec8b7e1033e711fc640a1489ca54adc0a
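If you prefer a friendlier handle than the auto-generated ID, Docker also lets you name the container at creation time; the name below is just an example:

C:\CloudVedas>docker run -t -d --name cloudvedas-ubuntu 6e4f1fe62ff1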



  • List the running containers using “docker container ls”
C:\CloudVedas>docker container ls
CONTAINER ID        IMAGE               COMMAND             CREATED              STATUS              PORTS     NAMES
a8dee68d7802        6e4f1fe62ff1        "/bin/bash"         About a minute ago   Up About a minute             eager_johnson



We can see our container was created about a minute ago and is up. You can also identify the container using the container ID. Note that the container ID is the same as the first 12 characters of the string we got when we executed docker run in the last step.
  • Let’s get inside our container and check it.
C:\CloudVedas>docker exec -it a8dee68d7802 /bin/bash


Once you are inside the Ubuntu container you can explore it. Let’s check the OS version.
root@a8dee68d7802:/# more /etc/os-release
NAME="Ubuntu"
VERSION="14.04.6 LTS, Trusty Tahr"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 14.04.6 LTS"
VERSION_ID="14.04"
HOME_URL="http://www.ubuntu.com/"
SUPPORT_URL="http://help.ubuntu.com/"
BUG_REPORT_URL="http://bugs.launchpad.net/ubuntu/"
root@a8dee68d7802:/#
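When you are done exploring, just type exit to leave the shell. Since we attached with docker exec, leaving this shell does not stop the container:

root@a8dee68d7802:/# exit
exit
C:\CloudVedas>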
  • If you don’t see your container up, it may have stopped automatically. To check all the containers which are or were running, execute the command below.
C:\CloudVedas>docker ps -a
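If your container shows a status of Exited in that list, you can start it again and then re-attach to it. A quick sketch using the container ID from our example:

C:\CloudVedas>docker start a8dee68d7802
C:\CloudVedas>docker exec -it a8dee68d7802 /bin/bash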

You can also create your own customized container image which you can use to deploy more containers in your environment. Refer to this post on how to create a container image. If instead of Docker Hub you are using AWS ECR, check the post on how to push/pull images to/from AWS ECR.

If you want to take Docker to the next level, you can use a Dockerfile or docker-compose files to install your application. For example, in our other post we cover how you can install WordPress in Docker.

Hope this post helped you. Do let us know if you have any queries or if you get stuck at any installation step.

How to become a Google Cloud Certified Professional Cloud Architect


In this post I'll be giving tips on how to prepare for the Google Cloud Certified Professional Cloud Architect exam. This will also be helpful for people who are currently working on other cloud platforms like AWS or Azure and are looking to broaden their skills to Google Cloud Platform (GCP).

As many of you who follow this blog know, I am already working on AWS and Azure. About a couple of years back we got heavily into Kubernetes. Being a curious techie, when I started digging further into Kubernetes I found that it was initially designed by Google. One thing led to another and I ended up exploring more about Google Cloud. In parallel, we started getting traction on a multi-cloud strategy, and GCP is also considered a good option with many features that are helpful for both startups and big enterprises.

So, I decided to get more knowledge and expertise on Google Cloud. When I compared AWS with GCP I felt that most of the technologies are similar, but obviously with different naming conventions and some technical differences in settings. Thus, if you have worked on AWS it won't be very difficult to grasp Google Cloud as well. But even if you don't have a background in other clouds, learning Google Cloud is not very difficult; you just need to spend extra time on the basics.

If you are from an AWS background, you can get a good comparison of AWS and GCP services here, and that's what I did as a starting point.

Next, I went through the Udemy course Google Cloud (GCP) 2020: Zero to Cloud Architect. This course covers the GCP services in detail, starting from the basics, so it is useful even for someone who is starting from scratch.

Since our company is a Google partner, I supplemented my preparation by enrolling in the online training labs of Qwiklabs. These labs are really helpful in getting good hands-on practice with the various GCP services.

Folks from an AWS background will find that the GCP services are not very different, but there are still some differences. For example, in AWS a subnet is restricted to an AZ, but in GCP a subnet is regional and can span multiple zones in a region. You have to keep these subtle differences in mind when designing in GCP.
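To make that difference concrete, here is a minimal sketch of creating a regional subnet with the gcloud CLI; the network name, IP range and region below are just illustrative values:

gcloud compute networks subnets create demo-subnet --network=demo-vpc --range=10.10.0.0/24 --region=us-central1

Notice that you specify a region, not a zone; instances in any zone of us-central1 can then use this subnet.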

If we talk specifically about the certification exam, it mainly focuses on the topics below:-

  • Managing and provisioning cloud infrastructure.
  • Designing and planning a cloud solution which is scalable, resilient and cost effective.
  • Security strengthening using IAM, firewall rules, etc.
  • Analyzing and monitoring application behavior using the GCP Operations Suite.
  • Managing HA and DR.
The exam is in multiple-choice question format. You will also get 2-3 case studies, and you have to select the answer which is most suitable considering the scenario mentioned in the case study.

You can choose to appear for the exam at a test center or go for an online proctored exam from home. Considering the Corona situation, I appeared for an online proctored exam. You just have to follow the guidelines mentioned in the link I have shared above, and with a good internet connection it is pretty easy to appear for the exam from home.

Overall I found the exam to be very engaging, covering a wide range of topics.

If you have any queries regarding the exam preparation or GCP in general, please post them in the comments section below.

How to transfer files to and from an EC2 instance



In our earlier post we showed how you can use FileZilla, a GUI-based solution, to transfer files to an EC2 instance. But in many companies, installation of third-party software like FileZilla is not allowed.

So, in this post we will show you how you can transfer files to and from an EC2 Linux instance using our old, trustworthy friend SFTP.

For those who don't know about SFTP, let us give you the gist of what it is.

SFTP stands for SSH File Transfer Protocol (also called Secure File Transfer Protocol); since it runs over SSH, it works on the same port 22 as ssh. It's secure in comparison to FTP, which works on port 21 and is nowadays often blocked for security reasons. sftp is generally pre-installed on most Linux distributions, including Amazon Linux.

Also, if you compare it with SCP, which supports only file transfers, SFTP allows you to perform a range of operations on remote files and to resume file transfers.

If you want to know more about SFTP, please look at this SFTP wiki page.

Now let's see how we can use sftp to transfer files.

The pre-requisites for this are only two things, both of which are pretty much standard requirements for accessing your EC2 Linux instances.

1) The SSH .pem key which you configured when you built the remote server you want to connect to.

2) Port 22 should be open to access the remote server. (In case you want to know how to check if a port is open on a remote Linux/Unix server from the CLI without using telnet, check this post.)

Once you have checked that you have fulfilled the pre-requisites let's move to the next step.

Open your shell terminal; it can be Git Bash installed on your local Windows desktop or a Linux shell terminal.

Inside the terminal, execute the command below.

sftp -o IdentityFile=identity_file ec2-user@10.xxx.xxx.xxx

where identity_file is your .pem key.

Your actual command will look like this:

sftp -o IdentityFile=cloudvedas.pem ec2-user@192.168.0.10
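If your OpenSSH client is reasonably recent, the shorter -i option does the same thing:

sftp -i cloudvedas.pem ec2-user@192.168.0.10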

Let's check our directory on the remote server.

sftp> pwd

Remote working directory: /home/ec2-user

Let's go to /tmp

sftp> cd /tmp

Let's transfer a file from the local machine to the remote server.

sftp> put test123-file.sh

Now, if you want to transfer a file from the remote server to the local machine:

sftp> get remote123-file.sh

Note: in the OpenSSH sftp client the command names are case insensitive, so both put/get and PUT/GET will work.

If you forgot what the working directory on your local machine is, you can check that from the sftp prompt.

sftp> lpwd

Local working directory: /home/cloudvedas

If you want to change the directory on your local machine, do this:

sftp> lcd /tmp
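A few other interactive commands you may find handy: ls and lls list files on the remote and local machine respectively, put -r and get -r transfer whole directories (recursive transfers need a reasonably recent OpenSSH client), and bye closes the session. The directory names below are just examples.

sftp> ls
sftp> lls
sftp> put -r mydir
sftp> get -r remotedir
sftp> bye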

Hope this simple tutorial is helpful for you. Do let us know in the comments section if you have any queries, or share which methods you use to transfer files to EC2 instances.

AWS Subnet Calculator


This is a simple calculator to let you know how many usable IPv4 addresses you will get when you create a subnet in AWS.

AWS allows subnet masks only between /16 and /28. Five IPs in each subnet (the first four addresses and the last address) are reserved for AWS internal usage.

For example, in the subnet 10.0.0.0/24 the subnet mask is 24, so enter 24 below to get the number of available IPs in this subnet.
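If you just want the math behind the calculator, the usable count is 2^(32 - mask) minus the 5 addresses AWS reserves in every subnet. A quick bash sketch (the mask value is only an example):

MASK=24
echo $(( (1 << (32 - MASK)) - 5 ))
# prints 251, the usable IPs in a /24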

Enter Subnet Mask
Disclaimer: Please note that this is not an official AWS calculator. Please visit AWS VPC for more details.

AWS Security and Compliance Crash Course

In this post we will provide you the gist of the AWS security and compliance model.

Shared Security Model


AWS is responsible for securing the underlying infrastructure, while the customer is responsible for anything they put on the cloud or connect to the cloud.



Amazon is responsible for the security configuration of its products that are considered managed services, e.g. Amazon DynamoDB, Amazon RDS, Amazon Redshift, Amazon WorkSpaces and Amazon EMR.

IaaS :- Amazon EC2 and Amazon VPC are completely under the customer's control, and thus the customer has to take steps to make them secure and compliant.

Storage decommissioning :-

AWS uses the techniques detailed in DoD 5220.22-M and NIST 800-88 to destroy data as part of the decommissioning process.


AWS Services to secure the cloud 

  • AWS Config :- Manages configuration history and change notifications to enable security.
  • AWS Service Catalog :- Allows you to centrally manage commonly deployed IT services, thus enabling users to deploy only approved IT services in your organization.
  • Amazon GuardDuty :- Offers threat detection and continuous monitoring for malicious behavior in your AWS accounts.
  • AWS CloudHSM :- Protects your encryption keys with hardware security modules (HSMs).
  • Server-side Encryption :- Use this if you prefer S3 to manage the encryption process for you.
  • AWS IAM :- Secures access through IAM users, groups and roles. IAM roles can also be mapped to AD groups.
  • Amazon Macie :- Uses machine learning to automatically discover and protect sensitive data.
  • AWS CloudTrail :- Records all API calls made to your AWS account, whether programmatic or through the console.
 
  • AWS Artifact :- Provides access to AWS compliance reports from third-party auditors.
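As a quick illustration, a few standard AWS CLI commands let you verify that some of these services are switched on in an account (the output, region and profile depend on your setup):

aws cloudtrail describe-trails
aws guardduty list-detectors
aws configservice describe-configuration-recorders

The first lists the trails recording your API calls, the second confirms a GuardDuty detector exists, and the third shows whether AWS Config is recording configuration changes.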

Network security

  • You can connect to AWS access points via HTTP or HTTPS using SSL/TLS.
  • AWS Direct Connect :- Private connectivity between your datacenter and AWS.
  • For customers who require additional security, Amazon provides Amazon VPC, which provides private subnets within the AWS cloud and the ability to use an IPsec VPN (Virtual Private Network) device to create an encrypted tunnel between the Amazon VPC and your data center.
  • Amazon corporate network segregation :- Logically, the Amazon production network is segregated from the Amazon corporate network by means of a complex set of network security/segregation devices.

Network Monitoring and Protection

Amazon protects against different types of attacks:-


DDoS :- A Denial of Service (DoS) attack is a malicious attempt to affect the availability of a targeted system, such as a website or application, to legitimate end users. Typically, attackers generate large volumes of packets or requests, ultimately overwhelming the target system. In the case of a Distributed Denial of Service (DDoS) attack, the attacker uses multiple compromised or controlled sources to generate the attack.

Man-in-the-Middle attacks (MITM) :- In cryptography and computer security, a man-in-the-middle attack (often abbreviated MITM) is an attack where the attacker secretly relays and possibly alters the communication between two parties who believe they are directly communicating with each other.

IP spoofing :- IP spoofing is a technique used to gain unauthorized access to machines, whereby an attacker illicitly impersonates another machine by manipulating IP packets. IP spoofing involves modifying the packet header with a forged (spoofed) source IP address, a checksum, and the order value.

Port scanning :- A port scanner is an application designed to probe a server or host for open ports. It is often used by administrators to verify the security policies of their networks, and by attackers to identify services running on a host and exploit vulnerabilities.


AWS credentials types:-

  • Passwords
  • Multi-factor authentication (MFA)
  • AWS Microsoft AD
  • IAM roles
  • Access keys
  • Key pairs
  • X.509 certificates :- X.509 certificates are only used to sign SOAP-based requests. You can have AWS create an X.509 certificate and a private key that you can download, or you can upload your own certificate using the Security Credentials page.
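A small sketch of reviewing some of these credentials for an IAM user with the AWS CLI; the user name below is just an example:

aws iam list-access-keys --user-name cloudvedas-user
aws iam list-mfa-devices --user-name cloudvedas-user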

Automation :-

Amazon Inspector :- It's an automated security assessment service. It can be very helpful in finding OS vulnerabilities and suggesting patches.


Source: https://aws.amazon.com/