Wednesday, May 23, 2018

How to setup AWS S3 Remote Backend in Terraform


Overview

These days Terraform is the industry’s go-to tool for infrastructure automation. Terraform allows you to write infrastructure as code, which you can manage via source control; one of the many benefits is that you can keep track of changes to your infrastructure (which is otherwise a nightmare for any organization).
How does Terraform keep track of the changes in your environment? It creates a terraform.tfstate file on the local filesystem. The state file is simply a small database of the state of your environment. Whenever you run the terraform plan, apply, or destroy commands, Terraform reads the current state from the terraform.tfstate file and applies your changes against it.
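As a quick sketch of that workflow (assuming Terraform is installed and the current directory contains your .tf files):

```shell
# Initialize the working directory (downloads providers, sets up the backend)
terraform init

# Preview changes by diffing your .tf files against terraform.tfstate
terraform plan

# Apply the changes; Terraform updates terraform.tfstate afterwards
terraform apply

# By default the state file sits on your local filesystem
ls terraform.tfstate
```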

Problem

The problem arises when you are working in a team. Since the terraform.tfstate file is created on your local filesystem, other developers have no visibility into it. When another developer executes the same scripts, Terraform creates a new terraform.tfstate file that differs from the actual state of the environment.
A common workaround is to store terraform.tfstate in source control. That might work in a small team where only one person works at a time, or where each developer has a separate account. One issue is that the .tfstate file can contain sensitive information (such as RDS passwords) that you don’t want to upload to source control systems like GitHub.
Another option is Terraform Enterprise, which comes with all the bells and whistles.
In this post, I will show you how to solve this problem using remote backends: how to set up Terraform to keep the state of your environment in an S3 bucket.

Remote Backends

There are many types of remote backends you can use with Terraform but in this post, we will cover the popular solution of using S3 buckets.
Following are some benefits of using remote backends:
  1. Team Development — when working in a team, remote backends keep the state of your infrastructure in a centralized location
  2. Sensitive Information — with remote backends, your sensitive information is not stored on the local disk
  3. Remote Operations — building infrastructure can be time-consuming, and some remote backends support remote execution of operations. You can turn off your computer and your operation will still complete. Paired with remote state storage and locking, this also helps in team environments.
I hope that gives you enough info on remote backends; let’s dive into the solution. For the full walkthrough, visit https://medium.com/@zeebaig/terraform-using-aws-s3-remote-backend-52ea914fcbac
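As a minimal sketch, an S3 remote backend is configured with a terraform block like the following (the bucket name, key, region, and table name are placeholders you would replace with your own; the DynamoDB table for state locking is optional):

```hcl
terraform {
  backend "s3" {
    bucket         = "my-terraform-state-bucket"   # pre-created S3 bucket for state
    key            = "prod/terraform.tfstate"      # path of the state file inside the bucket
    region         = "us-east-1"
    encrypt        = true                          # encrypt the state file at rest
    dynamodb_table = "terraform-locks"             # optional: table used for state locking
  }
}
```

After adding this block, run `terraform init` so Terraform can migrate your existing local state to the S3 bucket.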




About DataNext

DataNext Solutions is a US-based system integrator specializing in Cloud, Big Data, and DevOps technologies. As a registered AWS partner, our services comprise Cloud Migration, Cost Optimization, Integration, Security, and Managed Services. Book a free assessment call with our experts today, or visit our website www.datanextsolutions.com for more info.


Thursday, May 17, 2018

Key Changes Under GDPR



We are all probably familiar with the term GDPR by now; if not, then you probably have hundreds of unread emails in your inbox about updated privacy policies from every service you ever signed up for. GDPR stands for General Data Protection Regulation. As per Wikipedia:
The General Data Protection Regulation (GDPR) (EU) 2016/679 is a regulation in EU law on data protection and privacy for all individuals within the European Union. It also addresses the export of personal data outside the EU. The GDPR aims primarily to give control to citizens and residents over their personal data and to simplify the regulatory environment for international business by unifying the regulation within the EU.
What is covered under GDPR and what is not? One could write a whole book about it, but I will try to explain it in simple terms.

What is GDPR?

Personal details such as IDs, birthdays, addresses, account numbers, health records, and other sensitive information are everywhere, in the hands of the partners and vendors we work with every day.
Because all this information is out there, we as individuals have to trust those parties to handle it securely; when they don’t, data breaches can cause inconvenience, cost time and money, and hurt reputations.
The European Union leads the way with the GDPR, a regulation that keeps information safe and protects the rights of real people: customers and partners around the world.
Following are some key highlights:

Individual Rights

Under the personal privacy section, individuals have the right to:
  1. Data transparency
  2. Full access to their data
  3. Rectification of their data
  4. Erasure of their personal data
  5. Opt out of, or object to, processing at any time

Organizations Responsibilities

Organizations will need to:
  1. Protect all personal data of any kind
  2. Determine the purpose and methods used for processing the data; organizations are responsible for any errors involving third parties as well
  3. Get individuals’ consent for data processing
  4. Be completely transparent with individuals about how and why their data is used
  5. Notify individuals and authorities of any data breaches

Your Responsibility

As a working professional, how do you identify whether you are compliant with GDPR? Ask yourself the following questions:
  1. Do I have permission to use this data?
  2. How can I protect this data?
  3. What do I do if the data is at risk?
By asking these questions, you will be well on your way to fulfilling your responsibilities under GDPR.
Hope this post helps you understand the GDPR fundamentals.


Friday, May 11, 2018

How to Collect Custom Metrics from AWS EC2 Instances


Overview


Monitoring is a critical part of any cloud infrastructure, and it is important for maintaining the reliability, availability, and performance of your AWS cloud applications. There are two main types of monitoring for AWS EC2 instances:

Basic Monitoring for Amazon EC2 instances: Seven pre-selected metrics at five-minute frequency and three status check metrics at one-minute frequency, for no additional charge.

Detailed Monitoring for Amazon EC2 instances: All metrics available to Basic Monitoring at one-minute frequency, for an additional charge. Instances with Detailed Monitoring enabled allow data aggregation by Amazon EC2 AMI ID and instance type.
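Detailed Monitoring can be toggled per instance; for example, with the AWS CLI (the instance ID below is a placeholder):

```shell
# Enable one-minute Detailed Monitoring on a running instance
aws ec2 monitor-instances --instance-ids i-0123456789abcdef0

# Revert to five-minute Basic Monitoring
aws ec2 unmonitor-instances --instance-ids i-0123456789abcdef0
```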


While Basic Monitoring is enabled by default, it does not cover the memory utilization or disk I/O of your instances; for those use cases you need to enable custom monitoring on the EC2 instances.

This post covers how to enable custom detailed monitoring and collect memory and disk metrics using the AWS CloudWatch agent; later you can build custom CloudWatch dashboards from these metrics.

Note: You can also monitor EC2 instances using the older Perl scripts; click here for more info.

In summary, you need to do the following:

  1. Create CloudWatch Role
  2. Assign CloudWatch Role to EC2 Instance
  3. Install CloudWatch agent on the EC2 Instance
  4. Configure Metrics
  5. Start CloudWatch agent
  6. Create CloudWatch Dashboards
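As a sketch of step 4, the agent’s configuration file (commonly saved as /opt/aws/amazon-cloudwatch-agent/etc/amazon-cloudwatch-agent.json) could collect memory and disk metrics like this; the exact measurements you pick are up to you:

```json
{
  "metrics": {
    "metrics_collected": {
      "mem": {
        "measurement": ["mem_used_percent"]
      },
      "disk": {
        "measurement": ["used_percent"],
        "resources": ["*"]
      }
    }
  }
}
```

For step 5, you would then start the agent with something like `sudo /opt/aws/amazon-cloudwatch-agent/bin/amazon-cloudwatch-agent-ctl -a fetch-config -m ec2 -c file:/opt/aws/amazon-cloudwatch-agent/etc/amazon-cloudwatch-agent.json -s`.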

See the full post for the detailed, step-by-step instructions.







Thursday, May 3, 2018

Assigning Public IPs to AWS EC2 Instances


Quick post today. Recently a customer told us that they had created a new VPC in AWS, but when they launched an EC2 instance in a public subnet, no public IP was assigned to it.

To have public IPs assigned automatically, there are a few things to check when you launch an EC2 instance:
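Assuming the subnet already routes to an internet gateway, the relevant settings can be changed with the AWS CLI like this (all resource IDs below are placeholders):

```shell
# Make the subnet auto-assign public IPs to instances launched in it
aws ec2 modify-subnet-attribute \
    --subnet-id subnet-0123456789abcdef0 \
    --map-public-ip-on-launch

# Or request a public IP explicitly for a single launch
aws ec2 run-instances \
    --image-id ami-0123456789abcdef0 \
    --instance-type t2.micro \
    --subnet-id subnet-0123456789abcdef0 \
    --associate-public-ip-address
```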




Saturday, April 28, 2018

Copy EC2 Instances from one account to another


Overview

Recently one of our customers came up with a requirement to merge assets into a single AWS account. There are other ways, such as AWS Organizations, to manage multiple AWS accounts, but in this case the requirement was clear: move EC2 instances from one account to another.

The solution to this requirement is quick, straightforward, and convenient with AWS. To summarize, you need to do the following:
  1. Obtain the ID of the AWS account to which you want to copy/move/migrate the EC2 instance
  2. On the source AWS account, create an AMI from the existing EC2 instance
  3. Grant access permissions on the AMI to the target AWS account
  4. Log in to the target AWS account and launch a new EC2 instance from the AMI
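With the AWS CLI, steps 2 and 3 look roughly like this (all IDs and names below are placeholders):

```shell
# Step 2: create an AMI from the source instance (run in the source account)
aws ec2 create-image \
    --instance-id i-0123456789abcdef0 \
    --name "migration-ami" \
    --description "AMI for cross-account migration"

# Step 3: grant launch permission on the AMI to the target account
aws ec2 modify-image-attribute \
    --image-id ami-0123456789abcdef0 \
    --launch-permission "Add=[{UserId=123456789012}]"
```

Note that if the AMI’s backing snapshots are encrypted, you also need to share the snapshots and the encryption key with the target account.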
For a detailed solution with screenshots visit original post: https://datanextsolutions.com/blog/aws-how-to-copy-ec2-instances-to-another-account/

Cheers,
Zeeshan Baig

Saturday, April 16, 2016

My Slides from Collaborate 2016

Geeks,

Following are my slides from Collaborate 2016 in Las Vegas.

My first session, 'Architecting for the Cloud', had a full house, and it was great to see the audience response: many attendees asked me about the slides after the session. Unfortunately, the OES session got low attendance, as it was scheduled as the last session on Wednesday and many people had left for the party.

Here are my slides from both sessions. See you sometime in the future.

Architecting for the Cloud: Best Practices







Build Fine-Grained Authorization for WebCenter Using Oracle Entitlements Server (OES)