Saturday, 24 May 2025

Git

1 git add

↳ It lets you add changes from the working directory into the staging area

2 git commit

↳ It lets you save a snapshot of currently staged changes in the local repository, with a message

3 git push

↳ It lets you upload committed changes from the local repository to a remote repository

4 git fetch

↳ It lets you download changes from a remote repository, without applying them locally

5 git merge

↳ It lets you combine changes from one branch into another

6 git pull

↳ It lets you fetch and then merge changes from a remote repository into the local branch

7 git diff

↳ It lets you see the changes in the working directory that are not yet staged

8 git diff HEAD

↳ It lets you see changes between the current working directory and the latest commit

9 git status

↳ It shows you the current state of the working directory and staging area

10 git branch

↳ It lets you see all local branches

11 git checkout

↳ It lets you switch between branches (and create one first with -b)

12 git log

↳ It shows you commits in the current branch with extra details

13 git stash

↳ It lets you temporarily save uncommitted changes and apply them later

14 git rebase

↳ It lets you reapply the commits of one branch on top of another, rewriting their history

15 git reset

↳ It lets you move the current branch back to a specific commit, optionally discarding staged or working-directory changes

16 git revert

↳ It lets you undo a commit by creating a new commit that reverses its changes

17 git cherry-pick

↳ It lets you apply a specific commit from one branch onto another

———

- Working directory

↳ It's the directory where you make changes to the files

- Staging area

↳ It lets you control which changes go into a commit

- Local repository

↳ It's the version of a project stored on the local machine of a user

- Remote repository

↳ It's the version of a project stored on a server, such as GitHub
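
The glossary above can be exercised end-to-end in a throwaway repository; the file name and commit message below are made up for illustration:

```shell
# A throwaway repo to exercise the flow (created in a temp directory).
cd "$(mktemp -d)" && git init -q demo-repo && cd demo-repo
git config user.name "Demo" && git config user.email "demo@example.com"
echo "hello" > notes.txt
git add notes.txt               # working directory -> staging area
git commit -q -m "Add notes"    # staging area -> local repository
git status --short              # prints nothing: the tree is clean again
git log --oneline               # the "Add notes" commit is in the local history
# git push origin main          # would upload it to a remote, if one were configured
```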

Wednesday, 20 December 2023

Linux command to check CPU Utilization

awk '{u=$2+$4; t=$2+$4+$5; if (NR==1){u1=u; t1=t;} else print (u-u1) * 100 / (t-t1) "%"; }' \
  <(grep 'cpu ' /proc/stat) <(sleep 1; grep 'cpu ' /proc/stat)
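
The same calculation can be unpacked into a short commented script (a readability rewrite of the one-liner above; it reads the same user, system, and idle fields from /proc/stat and prints an integer percentage instead of a float):

```shell
# Sample the aggregate "cpu " line of /proc/stat twice, one second apart.
set -- $(grep '^cpu ' /proc/stat); user1=$2; sys1=$4; idle1=$5
sleep 1
set -- $(grep '^cpu ' /proc/stat); user2=$2; sys2=$4; idle2=$5

# Busy jiffies (user + system) over total elapsed jiffies (user + system + idle).
busy=$(( (user2 + sys2) - (user1 + sys1) ))
total=$(( busy + (idle2 - idle1) ))
echo "$(( busy * 100 / total ))%"
```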

Wednesday, 25 January 2023

Useful notes for CKA exam

 

## Shortcuts to save time 

export do="--dry-run=client -o yaml"

export now="--force --grace-period 0"

k delete pod x $now

alias kn='kubectl config set-context --current --namespace '

kn my-namespace 


## JOIN KUBE CLUSTER ##

kubeadm token create --print-join-command


## Check Service CIDR ##

ssh cluster1-controlplane1

cat /etc/kubernetes/manifests/kube-apiserver.yaml | grep range


## Which CNI plugin is configured, and where is its config file ##

find /etc/cni/net.d/

cat /etc/cni/net.d/10-weave.conflist


## Show Latest events ##

kubectl get events -A --sort-by=.metadata.creationTimestamp


### find pod running on cluster2-node1

k -n kube-system get pod -o wide | grep proxy 


## killing pod on a node 

crictl ps | grep kube-proxy

crictl stop e6fe93fbaec50

crictl rm e6fe93fbaec50

crictl ps | grep kube-proxy


## Write the names of all namespaced resources into file 

 k api-resources --namespaced -o name > /opt/course/16/resources.txt


## Check which namespace has the most Roles (count)

  k -n <namespacename> get role --no-headers | wc -l
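
To compare counts across every namespace at once, a small loop can be sketched (assuming kubectl access to the cluster; the sort puts the namespace with the most Roles last):

```shell
# Print "<namespace>: <role count>" for every namespace, sorted by count.
for ns in $(kubectl get ns --no-headers -o custom-columns=':metadata.name'); do
  echo "$ns: $(kubectl -n "$ns" get role --no-headers 2>/dev/null | wc -l)"
done | sort -t: -k2 -n
```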


cat <<EOF | kubectl apply -f -
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: app-role
  namespace: project-hamster
rules:
- apiGroups:
  - ""
  - apps
  - autoscaling
  - batch
  - extensions
  - policy
  - rbac.authorization.k8s.io
  resources:
  - configmaps
  - secrets
  verbs: ["get", "list", "watch", "create", "update", "patch", "delete"]
EOF



cat <<EOF | kubectl apply -f -
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: app-rolebinding
  namespace: project-hamster
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: app-role
subjects:
- kind: ServiceAccount
  name: processor
  namespace: webapps
EOF
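
Once both manifests are applied, the grant can be verified with `kubectl auth can-i`, impersonating the bound ServiceAccount (the names are the ones used above; this needs a live cluster):

```shell
# Expect "yes" if the Role and RoleBinding are wired up correctly.
kubectl auth can-i get configmaps \
  -n project-hamster \
  --as system:serviceaccount:webapps:processor
```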

Tuesday, 26 July 2022

Creating EKS cluster

 https://docs.aws.amazon.com/eks/latest/userguide/create-cluster.html

Checking EKS status using aws cli

 


aws eks describe-cluster \

    --region us-east-1 \

    --name my-cluster \

    --query "cluster.status"

Creating EKS using aws CLI

 aws eks create-cluster --region us-east-1 --name my-cluster --kubernetes-version 1.22 \

   --role-arn arn:aws:iam::73095:role/AmazonEKSClusterRole \

   --resources-vpc-config subnetIds=subnet-0814938360f2fdf5b,subnet-040ad1e2b5ffe4775
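
Once the cluster reports ACTIVE, kubectl credentials can be merged into the local kubeconfig with `aws eks update-kubeconfig` (same region and cluster name as above):

```shell
# Merge the new cluster's credentials into ~/.kube/config, then sanity-check.
aws eks update-kubeconfig --region us-east-1 --name my-cluster
kubectl get nodes
```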



KUBE master setup

 root@kubernetes-master:/home/ubuntu# history

    1  swapoff -a

    2  exit

    3  swapoff -

    4  swapoff -a

    5  rm -rf /swap.img

    6  exit

    7  sudo apt update

    8  sudo apt install docker.io

    9  sudo systemctl enable docker

   10  sudo systemctl start docker

   11  sudo systemctl enable docker

   12  sudo apt install apt-transport-https curl

   13  curl -s https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add

   14  sudo apt-add-repository "deb http://apt.kubernetes.io/ kubernetes-xenial main"

   15  sudo apt install kubeadm kubelet kubectl kubernetes-cni

   16  sudo swapoff -a

   17  vi /etc/fstab

   18  sudo hostnamectl set-hostname kubernetes-master

   19  sudo kubeadm init

   20  mkdir -p $HOME/.kube

   21  sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config

   22  sudo chown $(id -u):$(id -g) $HOME/.kube/config

   23  kubectl apply -f https://raw.githubusercontent.com/coreos/flannel/master/Documentation/kube-flannel.yml

   24  kubectl apply -f https://raw.githubusercontent.com/coreos/flannel/master/Documentation/k8s-manifests/kube-flannel-rbac.yml

   25  kubectl

   26  kubectl get pods --all-namespaces

   27  kubectl get nodes

   28  kubectl get pods --all-namespaces

   29  kubectl

   30  sudo swapoff -a

   31  kubectl get nodes

   32  sudo -i

   33  kubectl get nodes

   34  ufw allow 6443/udp && ufw allow 6443/tcp

   35  kubectl get nodes

   36  sudo -i

   37  strace -eopenat kubectl version

   38  kubectl get nodes

   39  kubectl

   40  sudo dockerf ps

   41  sudo docker ps

   42  kubectl get src

   43  reboot

   44  kubectl get nodes

   45  history

KUBE worker SETUP

 root@kubernetes-worker:/home/ubuntu# history

    1  sudo apt update

    2  sudo apt install docker.io

    3  sudo systemctl start docker

    4  sudo systemctl enable docker

    5  sudo apt install apt-transport-https curl

    6  curl -s https://packages.cloud.google.com/apt/doc/apt-key.gpg | sudo apt-key add

    7  sudo apt-add-repository "deb http://apt.kubernetes.io/ kubernetes-xenial main"

    8  sudo apt install kubeadm kubelet kubectl kubernetes-cni

    9  sudo swapoff -a

   10  sudo hostnamectl set-hostname kubernetes-worker

   11  hostname


   21  kubeadm join 172.3.230:6443 --token b683jdx2fpzby --discovery-token-ca-cert-hash sha256:57151584e5bfdc0a7168cadc19b298f5a5aab3f3ad90f5d0a2921d54

   22  history

root@kubernetes-worker:/home/ubuntu#

Saturday, 23 January 2021

Algorithms trading live stock data

This is what I found on the internet:


There is no free lunch here in the data segment.


1) You have to enter into a datafeed agreement with NSE. Real-time datafeed is quite costly; the yearly charges come to around Rs 20 lakh per exchange, plus service taxes.

2) A per-user fee applies, depending on how you plan to monetize the datafeed. Normally the per-user fee for the NSE cash/futures segment is Rs 360/user, and the fee for the first 300 users is waived.

3) You have to define yourself as a data vendor or a data redistributor, or state that you plan to show the exchange data free of cost.

4) You can lower your cost by getting the data from a data redistributor (e.g. GlobalDataFeeds), which normally reduces your fee by about 20%. However, you still have to sign a data-vending agreement with NSE. Without NSE's permission you are not allowed to vend or redistribute the data, even free of cost; otherwise you will be tagged as an illegal data vendor by DOTEX.

5) Only data redistributors (i.e. authorized data vendors like GlobalDataFeeds, eSignal, Bloomberg, etc.) develop their own APIs, not the exchange itself.

6) If you are looking for low-cost options, you can prefer snapshot data or 15-minute-delayed datafeeds. The same rules above apply to most exchanges.

Friday, 22 January 2021

Ansible agent like Chef

 

By default, Ansible does not have an agent; it is agentless and connects to managed hosts over SSH.

We can use SDP solutions like ZeroTier as an alternative way to reach those hosts.

Tuesday, 19 January 2021

My First Flight (The real bird’s-eye view)

Ha ha !!!!!!!!


Memories are still fresh !!!!!


It's been a while since, as a fresher, I joined an IT company in Bengaluru as a software engineer. I used to travel between cities by bus or train.


I was waiting for a chance to travel by flight, which never happened.


One fine day I got a call from my manager saying that I would be travelling to Chandigarh for some project work and needed to report there in a week's time.


The next day, while I was wondering how to travel (it is almost a 2,500 km journey), I received a call from my manager asking for my travel details, and I told him my concerns.


My manager giggled and said that I would be travelling by flight and that the expenses would be taken care of. I was so happy, and a bit nervous too: I had to travel alone, and it was my first time covering that distance, that too by flight.


Somehow I managed to book a flight and reached the airport. The entire environment was new to me: the boarding protocols and the security checks.


I got my boarding pass and went to the lobby, where I was told that my flight would depart in the next half an hour, so I waited.

I cleared the checks, and the moment I entered the flight all that nervousness grew and my heart started pumping. I got a window seat, took a few pictures through the window, and asked someone to take a picture of me.

Somehow I got adjusted to the environment. Everyone settled into their seats, the air hostess went through the safety instructions, and the captain introduced himself and announced the details of the journey.


The flight moved slowly on the runway and gradually picked up speed. OMG, it was an awesome feeling when the flight went airborne. It was a smooth take-off, and I saw the city from the sky: the real BIRD'S-EYE view.


I took snaps of the city, the clouds, and the blue sky, and just enjoyed every minute of flying. The journey continued for an hour, and the flight landed with a thrill of its own. Yeah, that is one of the best first-time experiences of my life.



Monday, 18 January 2021

Algorithmic Trading overview


Algorithmic trading means using computers to make investment decisions


Main players of algorithmic trading


Renaissance Technologies

AQR Capital Management

Citadel


Python is the most popular programming language used for algorithmic trading 


Libraries

NumPy

Pandas

Requests

Xlsxwriter

Math


Algorithmic trading process 


1. Collect data

2. Develop a hypothesis for a strategy

3. Backtest that strategy

4. Implement the strategy in production

Thursday, 7 January 2021

Python script to convert json to csv

import json
import csv
import logging
import datetime as dt

# Note: the original note left the file names blank; fill in real paths here.
json_file_path = r"D:\.json"
csv_file_path = r"D:\.csv"
global_key = []
global_val = []
final_values = [{}]

print('Device Details script started at: %s' % dt.datetime.utcnow())


def to_string(s):
    # In Python 3, str() handles unicode natively.
    return str(s)


def set_final_values(final_key, final_val, path_key_str):
    # Use the last list index seen in the key path to pick the output row.
    b = 0
    for a in final_key:
        try:
            b = int(a)
        except (TypeError, ValueError):
            continue
    # Grow the row list until the target row exists.
    while len(final_values) < b + 1:
        final_values.append({})
    logging.debug("Keys and values updated; assigning to final_values list")
    final_values[b][path_key_str] = final_val


def fetch_key_value(input_json, path_key=None, path_val=None, last_key=None):
    if path_key is None:
        path_key = []
        path_val = []

    if last_key is not None:
        path_key = path_key + [last_key]

    if type(input_json) is dict and input_json:
        for key, val in input_json.items():
            if type(val) is not dict and type(val) is not list:
                path_val = path_val + [to_string(val)]
            # Recurse on every value; scalars are handled by the else branch below.
            path_key, path_val = fetch_key_value(input_json[key], path_key, path_val, key)
        if len(path_key) > 0:
            path_key.pop()
    elif type(input_json) is list and input_json:
        for index, entity in enumerate(input_json):
            path_key, path_val = fetch_key_value(entity, path_key, path_val, index)
        if len(path_key) > 0:
            path_key.pop()
    else:
        if path_key:
            global global_key
            global global_val
            val = path_val
            tmp_val = ""
            path_key_str = ""
            for i in path_key:
                if path_key_str == "":
                    path_key_str = to_string(i)
                elif not isinstance(i, int):
                    # Skip list indices so rows of the same list share a column.
                    path_key_str = path_key_str + "_" + to_string(i)
            global_key = global_key + [path_key_str]
            if val:
                tmp_val = str(val.pop())
                set_final_values(path_key, tmp_val, path_key_str)
            path_key.pop()
            global_val = global_val + [tmp_val]
    return path_key, path_val


def enter_header(header):
    final_list = []
    for x in header:
        if x not in final_list:
            final_list.append(to_string(x))
    print("Header added.")
    logging.debug("Header added.")
    return final_list


def enter_csv_row(rows, csvwriter):
    for x in rows:
        if x != {}:
            csvwriter.writerow(x)
    print("Rows added.")
    logging.debug("Rows added.")


def format_data(local_key, local_values):
    # 'w' with newline='' replaces the Python 2 'wb' mode for csv files.
    with open(csv_file_path, 'w', newline='') as csv_file:
        header = enter_header(local_key)
        csvwriter = csv.DictWriter(csv_file, fieldnames=header)
        csvwriter.writeheader()
        enter_csv_row(local_values, csvwriter)


try:
    with open(json_file_path, 'r') as json_file:
        json_data = json.load(json_file)
        json_keys, json_vals = fetch_key_value(json_data)
        format_data(global_key, final_values)
        print("CSV created successfully.")
        logging.debug("CSV created successfully.")
except Exception as e:
    print("Exception => " + str(e))
    logging.debug("Exception => " + str(e))

Monday, 4 January 2021

Home loan process

 

Fill The Loan Application Form & Attach The Documents

Pay The Processing Fee

Discussion With The Bank

Valuation Of The Documents

The Sanction/Approval Process

Processing The Offer Letter

Processing The Property Papers Followed By A Legal Check

Processing A Technical Check & The Site Estimation

The Final Loan Deal

Signing The Agreement

The Loan Disbursal

Monday, 14 December 2020

Python regex to match start and end of sentence

 


# Check if the string starts with "The" and ends with "chennai":
import re

txt = "The rain in chennai"
x = re.search("^The.*chennai$", txt)  # x is a Match object (truthy) since the pattern matches

Saturday, 28 November 2020

Why do we need weekend ??

 

  1. To improve yourself as a person, rather than always improving your professional skills.
  2. To re-energize yourself; when you go back after the weekend, you are more relaxed and productive. A couple of days are really required to forget the rudeness of office colleagues and managers.
  3. To spend time with friends and family, and to plan a weekend trip to a nearby place. It is also important to keep strengthening the bonds with those around you; these are the days you will remember even when you retire.
  4. To spend time alone and learn some new skills. You can meditate and learn about spirituality, or enhance your skills in something else, which can help you if the technology you currently work in goes outdated. You will come out a more confident and secure person.

Monday, 26 October 2020

Hackathon flow

 


Flow 

Register -> Create Teams -> View Problems/Deadlines/Leaderboard -> Submit Solutions -> Results
