How to become a Google Cloud Certified Professional Cloud Architect

In this post I'll be giving tips on how to prepare for the Google Cloud Certified Professional Cloud Architect exam. This will also be helpful for people who are currently working on other cloud platforms like AWS or Azure and are looking to broaden their skills to Google Cloud Platform (GCP).

As many of you who follow this blog know, I already work on AWS and Azure. About a couple of years back we got heavily into Kubernetes. Being a curious techie, when I started digging further into Kubernetes I found that it was originally designed by Google. One thing led to another, and I ended up exploring more about Google Cloud. In parallel, we started getting traction on a multi-cloud strategy, and GCP is also considered a good option with many features that are helpful for both startups and big enterprises.

So, I decided to gain more knowledge and expertise on Google Cloud. When I compared AWS with GCP, I felt that most of the technologies are similar, but obviously with different naming conventions and some technical differences in settings. Thus, if you have worked on AWS, it won't be very difficult to grasp Google Cloud as well. And even if you don't have a background in other clouds, learning Google Cloud is not very difficult; you just need to spend extra time on the basics.

If you are from an AWS background, you can get a good comparison of AWS and GCP services here, and that's what I did as a starting point.

Next, I went through the Udemy course Google Cloud (GCP) 2020: Zero to Cloud Architect. This course covers the GCP services in detail, starting from the basics, so it is useful even for someone who is starting from scratch.

Since our company is a Google partner, I supplemented my preparation by enrolling in the online training labs of Qwiklabs. These labs are really helpful in getting good hands-on practice with the various GCP services.

Folks from an AWS background will find that the GCP services are not very different, but there are still some differences. For example, in AWS a subnet is restricted to a single Availability Zone, but in GCP a subnet is regional and spans all the zones in that region. You have to keep these subtle differences in mind when designing in GCP.
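To make that difference concrete, here is a sketch of creating a regional subnet with the gcloud CLI. The network name, region, and CIDR range below are placeholder values, not from this post; the command is stored in a variable and printed rather than executed, so run it yourself with an authenticated gcloud.

```shell
# Sketch only: demo-vpc, us-central1 and the range are placeholders.
# Note there is no zone flag at all -- a GCP subnet belongs to a region.
cmd='gcloud compute networks subnets create demo-subnet
  --network=demo-vpc
  --region=us-central1
  --range=10.10.0.0/24'
echo "$cmd"
```

Because the subnet is regional, instances in any zone of that region can draw addresses from this one range, which is exactly what you cannot do with a single AWS subnet.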

If we talk specifically about the certification exam, it mainly focuses on the following topics:

  • Managing and provisioning cloud infrastructure.
  • Designing and planning a cloud solution that is scalable, resilient, and cost-effective.
  • Strengthening security using IAM, firewall rules, etc.
  • Analyzing and monitoring application behavior using the GCP Operations Suite.
  • Managing HA and DR.
The exam is in multiple-choice question format. You will also get 2-3 case studies, and you have to select the answer that is most suitable for the scenario described in the case study.

You can choose to appear for the exam at a test center or go for an online proctored exam from home. Considering the COVID-19 situation, I appeared for an online proctored exam. You just have to follow the guidelines mentioned in the link I have shared above, and with a good internet connection it is pretty easy to take the exam from home.

Overall, I found the exam to be very engaging, covering a wide range of topics.

If you have any queries regarding the exam preparation or GCP in general please post them in the comment section below. 

How to transfer files to and from EC2 instance

In our earlier post we showed how you can use Filezilla, a GUI-based solution, to transfer files to an EC2 instance. But in many companies the installation of third-party software like Filezilla is not allowed.

So, in this post we will show you how to transfer files to and from an EC2 Linux instance using our old trustworthy friend SFTP.

For those who don't know about SFTP, here is a gist of what it is.

SFTP stands for SSH File Transfer Protocol (also called Secure File Transfer Protocol), and it works on the same port 22 as SSH. It is secure in comparison to FTP, which works on port 21 and is nowadays often blocked for security reasons. SFTP is generally pre-installed on most Linux distributions, including Amazon Linux.

Also, if you compare it with SCP, which supports only file transfers, SFTP allows you to perform a range of operations on remote files and to resume file transfers.

If you want to know more about SFTP, please look at the SFTP wiki page.

Now let's see how we can use sftp to transfer files.

There are only two prerequisites, both of which are pretty much standard requirements for accessing your EC2 Linux instances.

1) The SSH .pem key you configured when you built the remote server you want to connect to.

2) Port 22 should be open to access the remote server. (In case you want to know how to check if a port is open on a remote Linux/Unix server from the CLI without using telnet, check this post.)
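As a quick sketch of checking the second prerequisite without telnet, you can use bash's built-in /dev/tcp device (this assumes bash and the coreutils `timeout` command are available; the host name in the usage comment is a placeholder for your instance's address):

```shell
# Returns success if a TCP connection to host $1, port $2 opens within 3 seconds.
port_open() {
  timeout 3 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

# Usage sketch (replace your-ec2-host with your instance's public DNS or IP):
#   port_open your-ec2-host 22 && echo "port 22 is open"
```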

Once you have checked that the prerequisites are fulfilled, let's move to the next step.

Open your shell terminal; it can be Git Bash installed on your local Windows desktop, or a Linux shell terminal.

Inside the terminal you need to execute the below command:

sftp -o IdentityFile=identity_file user@remote-host

where identity_file is your .pem key.

Your actual command will look like:

sftp -o IdentityFile=cloudvedas.pem ec2-user@

Let's check our working directory on the remote server:


sftp> pwd
Remote working directory: /home/ec2-user

Let's go to /tmp:

sftp> cd /tmp

Let's transfer a file from the local machine to the remote server:

sftp> put <local-file>

Now, if you want to transfer a file from the remote server to the local machine:

sftp> get <remote-file>

Note: The put and get commands are case sensitive and work in lowercase only.

If you forget what the home directory on your local machine was, you can check that from the sftp prompt:


sftp> lpwd
Local working directory: /home/cloudvedas

If you want to change the directory on your local machine, do this:

sftp> lcd /tmp
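Once you are comfortable with the interactive prompt, the same session can be scripted using sftp's -b batch mode. This is a sketch: the file names, key name, and host below are placeholders, not values from this post.

```shell
# Write the sftp commands (placeholder file names) to a batch file.
cat > transfer.batch <<'EOF'
cd /tmp
put report.txt
get /var/log/messages
bye
EOF

# Then run it non-interactively (replace the key and host with your own):
#   sftp -o IdentityFile=your-key.pem -b transfer.batch ec2-user@your-ec2-host
```

With -b, sftp executes the commands in order and exits non-zero if any of them fail, which makes it handy in cron jobs and CI scripts.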

Hope this simple tutorial is helpful for you. Do let us know in the comments section if you have any queries, or share what methods you use to transfer files to EC2 instances.

AWS Subnet Calculator

This is a simple calculator to let you know how many IPv4 addresses you will get when you create a subnet in AWS.

AWS allows subnet masks only between /16 and /28. Five IPs in each subnet are reserved for AWS internal usage.

For example, if the subnet mask is /24, enter 24 below to get the available IPs in that subnet.

Enter Subnet Mask

Disclaimer: Please note that this is not an official AWS calculator. Please visit AWS VPC for more details.
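The arithmetic behind the calculator is simple: a /n subnet has 2^(32-n) addresses, and AWS reserves 5 of them (the network address, the broadcast address, and 3 for the VPC router, DNS, and future use). A minimal shell sketch:

```shell
# Available IPv4 addresses in an AWS subnet with the given mask length.
# AWS reserves 5 addresses per subnet, and only allows masks /16 to /28.
available_ips() {
  mask="$1"
  if [ "$mask" -lt 16 ] || [ "$mask" -gt 28 ]; then
    echo "AWS only allows subnet masks between /16 and /28" >&2
    return 1
  fi
  echo $(( (1 << (32 - mask)) - 5 ))
}

available_ips 24   # 251 usable addresses in a /24 (256 - 5)
available_ips 28   # 11 usable addresses in a /28 (16 - 5)
```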

AWS Security and Compliance Crash Course

In this post we will provide you a gist of the AWS security and compliance model.

Shared Security Model

AWS is responsible for securing the underlying infrastructure, while the customer is responsible for anything they put on the cloud or connect to the cloud.

Amazon is responsible for the security configuration of its products that are considered managed services, e.g. DynamoDB, Amazon RDS, Amazon Redshift, Amazon WorkSpaces, and Amazon EMR.

IaaS :- Amazon EC2 and Amazon VPC are completely under the customer's control, and thus the customer has to take steps to make them secure and compliant.

Storage decommissioning :-

AWS uses the techniques detailed in DoD 5220.22-M and NIST 800-88 to destroy data as part of the decommissioning process.

AWS Services to secure the cloud 

  • AWS Config :- Manages configuration history and change notifications to enable security.
  • AWS Service Catalog :- Allows you to centrally manage commonly deployed IT services, enabling users to deploy approved IT services in your organization.
  • Amazon GuardDuty :- Offers threat detection and continuous monitoring for malicious behavior in your AWS accounts.
  • AWS CloudHSM :- Protects your encryption keys with hardware security modules (HSMs).
  • Server-side encryption :- If you prefer that S3 manage the encryption process for you.
  • AWS IAM :- Secures access through IAM users, groups, and roles. IAM roles can also be mapped to AD groups.
  • Amazon Macie :- Uses machine learning to automatically discover and protect sensitive data.
  • AWS CloudTrail :- Records all API calls to your AWS account, whether made programmatically or through the console.
  • AWS Artifact :- Gets you details of all the AWS compliance reports from third-party auditors.

Network security

  • You can connect to an AWS access point via HTTP or HTTPS using SSL/TLS.
  • AWS Direct Connect :- Private connectivity between your datacenter and AWS.
  • For customers who require additional security, Amazon provides Amazon VPC, which provides private subnets within the AWS cloud and the ability to use an IPsec VPN (Virtual Private Network) device to provide an encrypted tunnel between the Amazon VPC and your data center.
  • Amazon corporate network segregation :- Logically, the Amazon production network is segregated from the Amazon corporate network by means of a complex set of network security/segregation devices.

Network Monitoring and Protection

Amazon protects against different types of attacks:

DDoS :- A Denial of Service (DoS) attack is a malicious attempt to affect the availability of a targeted system, such as a website or application, for legitimate end users. Typically, attackers generate large volumes of packets or requests, ultimately overwhelming the target system. In the case of a Distributed Denial of Service (DDoS) attack, the attacker uses multiple compromised or controlled sources to generate the attack.

Man-in-the-middle attacks (MITM) :- In cryptography and computer security, a man-in-the-middle attack is an attack where the attacker secretly relays and possibly alters the communication between two parties who believe they are directly communicating with each other.

IP spoofing :- IP spoofing is a technique used to gain unauthorized access to machines, whereby an attacker illicitly impersonates another machine by manipulating IP packets. It involves modifying the packet header with a forged (spoofed) source IP address, a checksum, and the order value.

Port scanning :- A port scanner is an application designed to probe a server or host for open ports. It is often used by administrators to verify the security policies of their networks, and by attackers to identify services running on a host and exploit vulnerabilities.

AWS credential types:

  • Passwords
  • Multi-factor authentication (MFA)
  • AWS Microsoft AD
  • IAM roles
  • Access keys
  • Key pairs
  • X.509 certificates :- X.509 certificates are only used to sign SOAP-based requests. You can have AWS create an X.509 certificate and a private key that you can download, or you can upload your own certificate using the Security Credentials page.

Automation :-

Amazon Inspector :- It's an automated security assessment service. It can be very helpful in finding vulnerabilities in the OS and suggesting patches.


AWS DynamoDB Cheat Sheet

DynamoDB is a fast and flexible NoSQL database service for all applications that need consistent, single-digit-millisecond latency at any scale. It is a fully managed database and supports both document and key-value data models. It is great for IoT, mobile/web, gaming, and many other apps.

Quick facts about DynamoDB:
  • Stored on SSD storage.
  • Spread across 3 geographically distinct data centers.
  • Eventually consistent reads :- Consistency across all copies is usually reached within a second; repeating a read after a short time should return the updated data. (Best read performance.)
  • Strongly consistent reads :- Returns a result that reflects all writes that received a successful response prior to the read.

Items (like a row of data in a table)
Attributes (like a column of data in a table)

Here everything between the braces {} is an item, and 1587, "Alan", etc. are attribute values.

"ID" : 1587,
"Name" : "Alan"
"Phone": "555-5555"

Two types of primary keys are available:

  • Single attribute (think unique ID): a partition key (hash key) composed of one attribute.
  • Composite (think unique ID and date range): a partition key and sort key (hash and range) composed of two attributes.

Partition key
  • DynamoDB uses the partition key's value as input to an internal hash function. The output from the hash function determines the partition (the physical location in which the data is stored).
  • No two items in a table can have the same partition key value.
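To illustrate the idea (not DynamoDB's actual hash function, which is internal to the service), here is a toy sketch of how hashing a partition key's value selects one of N physical partitions; `cksum` stands in for "some deterministic hash":

```shell
# Toy illustration only: hash the partition key's value, then take it
# modulo the partition count to pick the physical partition.
partition_for() {
  key="$1"
  partitions="$2"
  hash=$(printf '%s' "$key" | cksum | cut -d' ' -f1)
  echo $(( hash % partitions ))
}

partition_for "1587" 4   # the same key value always lands on the same partition
```

The point of the sketch is that the mapping is deterministic: every item with the same partition key value is routed to the same partition, which is why hot keys can concentrate load on one partition.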

Partition Key and Sort Key
  • DynamoDB uses the partition key's value as input to an internal hash function. The output from the hash function determines the partition (the physical location in which the data is stored).
  • Two items in a table can have the same partition key, but they must have different sort keys.
  • All items with the same partition key are stored together, in sorted order by sort key value.

Local secondary index
  • It has the same partition key as the table but a different sort key.
  • Can only be created when creating a table; it cannot be removed or modified later.

Global secondary index:
  • It has a different partition key and a different sort key.
  • Can be created at table creation or added later.

DynamoDB streams
  • If a new item is added to the table, the stream captures an image of the entire item, including all of its attributes
  • If an item is updated, the stream captures the before and after image of any attributes that were modified in the item.
  • If an item is deleted from the table, the stream captures an image of the entire item before it was deleted.

A Query operation finds items in a table using only primary key attribute values. You must provide a partition key attribute name and a distinct value to search for. You can optionally provide a sort key attribute name and value, and use a comparison operator to refine the search results.
By default, a Query returns all of the data attributes for the items with the specified primary key(s); however, you can use the ProjectionExpression parameter so that the query returns only some of the attributes rather than all of them.

Query results are always sorted by the sort key. If the data type of the sort key is a number, the results are returned in numeric order; otherwise, the results are returned in order of ASCII character code values. By default, the sort order is ascending. To reverse the order, set the ScanIndexForward parameter to false.
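Putting the Query options above together, here is a sketch using the AWS CLI. The table name Contacts and its numeric partition key ID are hypothetical; also note that Name is a DynamoDB reserved word, so it has to be aliased through ExpressionAttributeNames. The command is built as a string and printed rather than executed, so run it yourself against a real table with valid credentials.

```shell
# Sketch: Query a hypothetical "Contacts" table, return only the aliased
# Name (#n) and Phone attributes, in descending sort-key order.
cmd='aws dynamodb query
  --table-name Contacts
  --key-condition-expression "ID = :id"
  --expression-attribute-values "{\":id\": {\"N\": \"1587\"}}"
  --expression-attribute-names "{\"#n\": \"Name\"}"
  --projection-expression "#n, Phone"
  --no-scan-index-forward'
echo "$cmd"
```

Here --no-scan-index-forward corresponds to ScanIndexForward=false, i.e. descending order; add --consistent-read if you need a strongly consistent read instead of the default eventually consistent one.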

By default, a Query is eventually consistent, but it can be changed to strongly consistent.

A Scan operation examines every item in the table. By default, a Scan returns all of the data attributes of every item; however, you can use the ProjectionExpression parameter so that the scan returns only some of the attributes rather than all of them.

Hope you find this quick glance of DynamoDB useful. Do let us know in comments if you have any query or suggestion.

Today we also want to share some good news with you: our blog is now included by Feedspot in its list of Top 10 AWS blogs. We would like to thank you all for your help and support in achieving this.