
Thursday, October 17, 2019

Webinar & Training





Unixcloudfusion is organising a webinar on the DevOps ecosystem, pipelines and value chain on 29th November 2019. It will be a 2.5-hour webinar, with 1.5 hours dedicated to the topics listed below and 1 hour for a question and answer round in which the trainer will answer most of your questions, doubts and queries related to DevOps.

The webinar will focus on the following topics:

1. Introduction to understanding the DevOps ecosystem, Infrastructure as Code and associated tools
2. Introduction to continuous build and delivery pipelines
3. Introduction to containerisation, Docker and the Kubernetes orchestration ecosystem
4. Job opportunities and career growth with advanced training & certification
5. Prerequisites and target audience for the advanced DevOps training

REGISTER FOR FREE WEBINAR


The webinar is especially suitable for anyone who wants to kickstart their career and growth in the field of DevOps, which has revolutionised the industry and the companies that follow these practices in their day-to-day activities.

With the introduction of cloud, hosted applications, mobile applications, multi-cloud availability, big data, machine learning and IoT, the huge amount of data being generated has only increased the scope of DevOps in realising the full potential of these advanced technologies.

This scope keeps growing as more and more companies adopt DevOps practices. Our trainer is hands-on with DevOps and has more than 10 years of experience in the field. He has undertaken migrations of applications from on-premises to the cloud and between regions, and has handled and processed terabytes and even petabytes of data.

The webinar will help you evaluate whether DevOps is the field you want to move into, given the large number of opportunities generated in DevOps every single day. If you are determined to learn DevOps, go ahead and take this webinar from us for free.

We are also providing a certificate of participation to everyone who takes this webinar; it will be sent to your email once you have successfully completed the webinar.

If you have any doubts, feel free to email [email protected]xcloudfusion.in.


Wednesday, October 16, 2019

Kubernetes Important Commands And Concepts

1. To list all the resource types in Kubernetes you can use:
kubectl api-resources -o name (lists the resource types by name only)

2. Spec
Every object in Kubernetes has a spec, provided by the user, which defines the desired state for that object.

3. Status
Status represents the current, actual state of the object. Kubernetes continually works to bring the actual state in line with the desired state specified in the spec.

4. kubectl get:- lists objects in Kubernetes, e.g. kubectl get pods -n kube-system or kubectl get nodes.
You can get more detailed information about an object, for example:
kubectl get nodes kube-node1 -o yaml (YAML representation of the object)
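
As a rough sketch (the field values below are illustrative, not taken from a real cluster), the YAML output of such a command contains both the spec and the status sections described in points 2 and 3:

apiVersion: v1
kind: Node
metadata:
  name: kube-node1
spec:
  podCIDR: 10.244.1.0/24       # desired state recorded for the object
status:                        # actual state reported by Kubernetes
  conditions:
  - type: Ready
    status: "True"
  nodeInfo:
    kubeletVersion: v1.16.2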

5. kubectl describe node kube-node1 (human-readable overview of an object, but not in YAML format)

6. Pods can contain one or more containers and a set of resources shared by those containers. All containers in kubernetes are part of a pod.

Example of a pod YAML: https://github.com/ankit630/IAC/blob/master/kubernetes/pods/ex-pod.yml
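
As a minimal sketch, a pod manifest looks roughly like the following (the container name and image are illustrative assumptions; the linked ex-pod.yml may differ):

apiVersion: v1
kind: Pod
metadata:
  name: ex-pod
  labels:
    app: ex-pod
spec:
  containers:
  - name: ex-container         # illustrative container name
    image: nginx:1.17          # illustrative image, not taken from the linked file
    ports:
    - containerPort: 80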

7. kubectl create -f ex-pod.yml (creates the pod in the Kubernetes cluster)

8. kubectl apply -f ex-pod.yml (applies changes in the manifest, such as a modified container definition, to the existing pod)

9. kubectl edit pod ex-pod (apart from apply, edit can also be used to modify a pod; saving the file automatically applies the changes)

10. kubectl delete pod ex-pod (Used to delete the existing pod)

11. Namespaces allow you to organise the objects in a cluster. Every object belongs to a namespace, and when no namespace is specified an object automatically goes into the default namespace.

12. kubectl get namespaces (lists the namespaces in the cluster)

13. kubectl create ns ex-ns (creates the ex-ns namespace in Kubernetes)

14. kubectl get pods -n ex-ns (lists the pods in the ex-ns namespace)
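
Instead of passing -n on every command, an object can also be placed in a namespace directly from its manifest by setting metadata.namespace (a sketch reusing the illustrative pod manifest above):

apiVersion: v1
kind: Pod
metadata:
  name: ex-pod
  namespace: ex-ns             # created in ex-ns instead of the default namespace
spec:
  containers:
  - name: ex-container         # illustrative container name
    image: nginx:1.17          # illustrative image

Once created, kubectl get pods -n ex-ns will list it.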

Part 2 Using Athena to query S3 buckets

This is a continuation of my previous post on how you can use Athena to query the S3 buckets storing your CloudTrail logs in order to better organise your security and compliance, which is a hard thing to achieve in legacy or large accounts with many users.

Question:- Identify the 100 most used IAM access keys. IAM roles are usually a better approach to authentication than long-lived IAM keys, as roles use temporary credentials that are rotated automatically, making the keys hard to intercept and increasing the security of the account.

Answer:-
SELECT
  useridentity.accesskeyid,
  useridentity.arn,
  eventname,
  COUNT(eventname) AS frequency
FROM account_cloudtrail
WHERE sourceipaddress NOT LIKE '%.com'
  AND year = '2019'
  AND month = '01'
  AND day = '01'
  AND useridentity.accesskeyid LIKE 'AKIA%'
GROUP BY useridentity.accesskeyid, useridentity.arn, eventname
ORDER BY frequency DESC
LIMIT 100

Friday, October 11, 2019

Part 1 Using Athena to query S3 buckets

While it's great to push all the log data gathered from various sources, such as your load balancers, CloudTrail and application logs, to S3 buckets, as your infrastructure grows it becomes difficult to analyse such a huge amount of data spanning months or years.

You can use the AWS Athena service to query the data in S3 without having to download and process it manually, which removes the need for extra processing, storage and so on. We are going to cover the most effective queries that can help you extract meaningful information from your S3 log data.

Question:- Identify all the users, events and accounts accessing a particular S3 bucket.
 Answer:-
SELECT DISTINCT
  account,
  eventname,
  useridentity.arn,
  useragent,
  vpcendpointid,
  json_extract_scalar(requestparameters, '$.bucketName') AS bucketName,
  sourceipaddress
FROM unixcloudfusion_cloudtrail
WHERE year = '2019'
  AND month = '10'
  AND day = '09'
  AND eventsource = 's3.amazonaws.com'
  AND json_extract_scalar(requestparameters, '$.bucketName') = 'unixcloudfusion.analytics'


Thursday, October 10, 2019

Command Logging & Kibana Plotting


Problem statement: Monitor and track all the activities/commands run by users on a system.


Minimum requirement(s):
1) A separate ELK cluster for command logging
2) The Snoopy Logger agent on all client machines
3) The Filebeat agent


Context: In order to track which commands are being fired by users, we would normally need the bash_history of each specific user, and it becomes a tedious task when we have to track a specific user (or multiple users) across different machines.

Solution:  Snoopy Logger is a tiny library that logs all executed commands (+ arguments) on your system.

Below is the link with more information on Snoopy, which also covers installing the Snoopy Logger:
https://github.com/a2o/snoopy

Through Snoopy Logger we get a single log file for all commands run by any user. You can specify a message format and a filter chain for filtering logs in Snoopy; based on that message format we need to create a grok pattern in Logstash. We can also exclude some repetitive internal commands with a drop filter in Logstash; the format for excluding a command is given below:

filter {
 if [command] == "command-name" {
   drop {
      percentage => 100
    }
  }
}