Monday, March 1, 2021

Cloud Comparisons by Resource

[Comparison table: AWS | Azure | Google — only the column headers survived export]



(1) ADF Trigger invokes the API through an ADF Web Activity on a schedule

(2) App Service invokes the IM Datalink Service and processes the data using transformation logic

(3) App Service generates a file in FIX format and sends it to the COMET SFTP server

(4) PagerDuty is integrated with APIM (Amber) so all notifications for connectivity failures are centralized

(5) Developer invokes the API to perform end-to-end testing on the developer machine

(6) AZDO is used for both application-code and infra-code deployments in CI/CD fashion
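Step (1) can be sketched as an ADF Web Activity definition. The activity name, URL, and body below are illustrative placeholders; authentication is omitted here since it is configured per the OAuth setup in these notes:

```json
{
  "name": "InvokeImDatalinkApi",
  "type": "WebActivity",
  "typeProperties": {
    "url": "https://<apim-host>/<api-path>",
    "method": "POST",
    "headers": { "Content-Type": "application/json" },
    "body": { "runDate": "@{trigger().scheduledTime}" }
  }
}
```

`@{trigger().scheduledTime}` is an ADF expression available when the pipeline is started by a schedule trigger.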

Azure: Set up OAuth 2.0 between ADF and APIM for secured, authenticated traffic

  •  Create Service Principal - presented to APIM for the auth token
    • Create an App Registration
    • Under Token configuration, add an optional claim to ensure only the app can call the endpoint, not a user/human
    • No explicit permissions needed
    • Add the Application URI
    • No explicit scopes needed
    • No RBAC at the resource level, as this is for the API, not users

    • SPN Name: dept-proj-env-oauth
    • SPN API Permissions: none
    • SPN RBAC permissions: none
    • SPN RBAC permission scope: none

  • Create a policy for APIM to validate the request
    • Write the policy
    • Route to the backend if validation succeeds
  • ADF pipeline uses api://imdev-dev as part of the REST request (this is also the Application URI for the SPN)
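The token presented to APIM comes from the OAuth2 client-credentials grant against the Microsoft identity platform v2.0 endpoint. A minimal sketch of the request a developer test script would build (tenant ID, client ID, and secret are placeholders for the SPN created above):

```python
# Sketch of the OAuth2 client-credentials token request made on behalf of
# the SPN (dept-proj-env-oauth) before calling the API behind APIM.
from urllib.parse import urlencode

TENANT_ID = "<tenant-guid>"
CLIENT_ID = "<spn-app-id>"
CLIENT_SECRET = "<spn-secret>"

def build_token_request(tenant_id: str, client_id: str, client_secret: str,
                        resource_uri: str = "api://imdev-dev"):
    """Return (token_endpoint, url-encoded body) for the client-credentials grant."""
    endpoint = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # the v2.0 endpoint expects the Application URI with the /.default suffix
        "scope": f"{resource_uri}/.default",
    })
    return endpoint, body

endpoint, body = build_token_request(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
```

POSTing that body to the endpoint returns a JSON payload whose `access_token` is then sent in the `Authorization: Bearer ...` header of the REST request.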

Thursday, August 6, 2020

Azure Databricks

 It is an Apache Spark-based analytics platform optimized for Microsoft Azure



key features

  1. Spark SQL library
  2. Streaming services for IoT, etc.
  3. MLlib machine learning library
  4. Graph computation (GraphX)


Spark core API supports

  1. R
  2. SQL
  3. Python
  4. Scala
  5. Java

Azure Databricks as a platform contains:

  1. Databricks Workspace
  2. Databricks Workflows
  3. Databricks Runtime
  4. Databricks I/O
  5. Databricks Serverless
  6. Databricks Enterprise Security (DBES)
Storage solutions supported
  1. Blob storage
  2. Data lake
  3. SQL DW
  4. Apache Kafka
  5. Hadoop
Applications Supported
  1. ML
  2. Streaming
  3. Data 
  4. Power BI
  5. Others
Users
  • Data Scientist
  • Data engineers
  • Analysts
  • Others



Azure Data Factory (REST endpoint on one side, storage on the other)

1. Need to have a REST API
2. Create Data Lake Storage
3. Create Azure Data Factory
4. Create an App Registration to access data from the Data Lake


Get the URL
Test the URL with Postman or a similar tool.

Create Data Lake Storage
Create the App Registration
Create two linked services (one for the REST endpoint, the other for Data Lake Storage)
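The two linked services can be sketched roughly as follows. Names and placeholder values are illustrative; `RestService` and `AzureBlobFS` are the ADF linked-service types for a REST endpoint and Data Lake Storage Gen2, with the App Registration's credentials used for storage access:

```json
[
  {
    "name": "LS_RestEndpoint",
    "properties": {
      "type": "RestService",
      "typeProperties": {
        "url": "https://<api-host>/<resource>",
        "enableServerCertificateValidation": true,
        "authenticationType": "Anonymous"
      }
    }
  },
  {
    "name": "LS_DataLake",
    "properties": {
      "type": "AzureBlobFS",
      "typeProperties": {
        "url": "https://<storage-account>.dfs.core.windows.net",
        "servicePrincipalId": "<app-registration-client-id>",
        "servicePrincipalKey": { "type": "SecureString", "value": "<client-secret>" },
        "tenant": "<tenant-id>"
      }
    }
  }
]
```

A Copy activity then uses a dataset bound to each linked service: REST as source, Data Lake as sink.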

RStudio Setup

 

Windows

Once R and RStudio are installed, open RStudio to make sure that you don’t get any error messages.

Saturday, February 22, 2020

AI - Important Terminology



SOLID Principles

S - Single Responsibility Principle

O - Open/Closed Principle

L - Liskov Substitution Principle

I - Interface Segregation Principle

D - Dependency Inversion Principle
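A minimal Python sketch of the last of these, the Dependency Inversion Principle (class names are illustrative): the high-level service depends on an abstraction, not on a concrete channel, so implementations can be swapped without changing it.

```python
# Dependency Inversion Principle: AlertService depends on the Notifier
# abstraction; the concrete notifier is injected at construction time.
from abc import ABC, abstractmethod

class Notifier(ABC):
    @abstractmethod
    def send(self, message: str) -> str: ...

class EmailNotifier(Notifier):
    def send(self, message: str) -> str:
        return f"email: {message}"

class SmsNotifier(Notifier):
    def send(self, message: str) -> str:
        return f"sms: {message}"

class AlertService:
    # High-level policy: knows nothing about email or SMS specifics.
    def __init__(self, notifier: Notifier):
        self.notifier = notifier

    def alert(self, message: str) -> str:
        return self.notifier.send(message)
```

Swapping `EmailNotifier()` for `SmsNotifier()` changes the delivery channel without touching `AlertService`.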

Friday, February 14, 2020

HadoopSetup

http://www.hadoopwizard.com/install-hadoop-on-windows-in-3-easy-steps-for-hortonworks-sandbox-tutorial/

https://hortonworks.com/downloads/#sandbox

https://www.youtube.com/watch?v=7sxqHgBdxB8

https://www.youtube.com/watch?v=xWgdny19yQ4

https://www.virtualbox.org/wiki/Downloads



Blockchain - Steps to Setup Environment

1) Install Ganache from truffleframework.com/ganache

2) Start it.

3) Check the npm version

4) Install Python 3.7.4 for Windows

5) npm -g install truffle@5.0.2

6) Install the MetaMask Google Chrome extension --> allows connecting to a personal blockchain account


7) truffle version

8) truffle init

9) package.json

10) Ganache, from the Truffle website

11) MetaMask extension from Google

12) Syntax highlighting for Solidity: Sublime Text

------------------------------------------
1) Create the app folder    --> mkdir election

2) Unbox the pet-shop project from Truffle --> truffle unbox pet-shop


Docker Commands

docker version                    # display the Docker version
docker login
docker pull name-of-the-image
docker pull webdevops/php-apache
docker search laravel
docker images
docker rmi IMAGE-ID
docker run IMAGE-ID
docker ps
docker stop CONTAINER-ID
docker ps -a                      # include stopped containers
docker rm [ID]