Azure From Scratch 3
Intro & Setting Up the Cloud Mindset
girishkrao.portfoliobox.net
theazureguys.wordpress.com
twitter.com/TheAzureGuy007
facebook.com/TheAzureGuy007
https://github.com/TheAzureGuy007
https://www.linkedin.com/in/girish-kalamati-357a6398/
https://www.youtube.com/channel/UCd9z6-2mZdqjRnAHh3W_9Uw
Azure Functions (Azure) or Lambda Functions (AWS)
Traditional Technical Architecture
Serverless Architecture
https://azure.microsoft.com/en-in/pricing/details/app-service/
Think of a real-world scenario where we can post “Build Pipeline event” info to Slack
https://azure.microsoft.com/en-us/services/functions/
Azure Functions: a GitHub webhook triggers the function whenever a comment is issued
Azure Functions: Store unstructured data using Azure Functions and Cosmos DB (Demo)
Get function URL: https://myfunctionappp.azurewebsites.net/api/Github-Webhook-JS?clientId=default
GitHub Secret : M4ZU1EmyWrZ/jimZkg*****************************ZJR9jg==
Azure Functions: Webhook + GitHub
https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-github-webhook-triggered-function
Learn how to create a function that is triggered by an HTTP webhook request with a GitHub-specific payload
Add the details in your own GitHub repo:
https://github.com/TheAzureGuy007/The-Azure-Guy-Repo/settings/hooks/new
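Before wiring up the hook, it helps to see what the trigger actually checks. The sketch below is not the portal's implementation; it just shows how GitHub's X-Hub-Signature header is validated against the shared secret (the "GitHub Secret" shown earlier). The secret and payload here are made up for illustration.

```python
import hashlib
import hmac

def verify_github_signature(secret: str, payload: bytes, signature_header: str) -> bool:
    """Return True if the X-Hub-Signature header matches the payload HMAC.

    GitHub signs webhook payloads with HMAC-SHA1 using the secret
    configured on the hook; the header looks like 'sha1=<hexdigest>'.
    """
    expected = "sha1=" + hmac.new(secret.encode(), payload, hashlib.sha1).hexdigest()
    # compare_digest avoids timing side channels when comparing digests
    return hmac.compare_digest(expected, signature_header)

# Hypothetical values, for illustration only:
secret = "my-webhook-secret"
payload = b'{"action": "created", "comment": {"body": "LGTM"}}'
header = "sha1=" + hmac.new(secret.encode(), payload, hashlib.sha1).hexdigest()
print(verify_github_signature(secret, payload, header))      # valid signature
print(verify_github_signature(secret, payload, "sha1=bad"))  # tampered payload
```

If the signatures do not match, the request did not come from your configured GitHub hook and should be rejected.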
Azure Cosmos DB
Globally distributed mission-critical applications
Guarantee access to users around the world with the high-availability and low-latency capabilities built into
Microsoft’s global datacenters.
Azure Cosmos DB
IoT
Instantly, elastically scale to accommodate diverse and unpredictable IoT workloads without sacrificing ingestion or query
performance.
Store unstructured data using Azure Functions and
Cosmos DB
Test the setup
Check that the data has been pushed
App Service Plan Definition
Azure PaaS and Azure Stack
Application Platform Offerings
There are three different methodologies available for running your applications on the Microsoft platform. They each give different levels of control and autonomy, while still providing the flexibility, availability, and cost savings associated with PaaS workloads.
•Azure App Service: allows you to leverage the benefits of a PaaS solution
•App Service Environment: allows you to leverage Azure App Services in a more isolated and scalable environment
•Azure Stack: allows you to have Azure in your data center and use the same controls and architecture as Azure while still maintaining your on-premises control. It can help you prepare to move more seamlessly into Azure if you wish to do so, as they are the same environment — one in your data center and one in the cloud.
Managing Azure App Services
Management Tools
There are a variety of tools you can use to interact with and manage Azure App Services. Below are the main ones that are typically available throughout Azure and are also applicable to Azure App Services.
•Azure PowerShell: a set of modules providing PowerShell cmdlets. Can be run on Windows, Linux, or macOS
•Azure Command Line Interface (CLI): open-source command-line shell. Can be run on Windows, Linux, or macOS
•REST APIs: REST based APIs for Azure resource manager (ARM)
•Templates: Resource Manager templates, to define resource objects and automate deployment and configuration
•Azure Portal:
• New Portal: portal.azure.com
• Classic Portal: manage.windowsazure.com
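All of these tools ultimately call the same Azure Resource Manager REST endpoints. As a rough sketch of that URL shape (the subscription, resource group, and site names below are placeholders, and the api-version is one commonly used for Microsoft.Web/sites — check the current ARM reference for your workload):

```python
# Sketch of the ARM REST endpoint shape that PowerShell, the CLI,
# and templates all drive under the hood. Values are placeholders.
ARM = "https://management.azure.com"

def web_app_url(subscription_id: str, resource_group: str, site: str,
                api_version: str = "2016-08-01") -> str:
    """Build the Resource Manager URL for a single Web App resource."""
    return (f"{ARM}/subscriptions/{subscription_id}"
            f"/resourceGroups/{resource_group}"
            f"/providers/Microsoft.Web/sites/{site}"
            f"?api-version={api_version}")

print(web_app_url("00000000-0000-0000-0000-000000000000", "my-rg", "myfunctionappp"))
```

A GET on that URL (with a bearer token) returns the resource's JSON; the same path appears inside ARM templates as the resource ID.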
Managing Azure App Services
You can run curl (file-transfer commands) or Git commands
Locking Resources
Creating a Lock
In Azure App Services it is possible to lock a subscription, resource group, or resource such as your web application, to prevent other users from deleting or modifying it. You can set the lock level to:
•CanNotDelete: authorized users can still read and modify a resource, but they can't delete it.
•ReadOnly: authorized users can read a resource, but they can't delete it or perform any actions on it. The permission on the resource is restricted to the Reader role.
Locks differ from role-based access control in that locks apply to all users and roles.
Configure Custom Domain Name
It is possible to create your own custom domain name and use it with your Web App hosted on Azure App Service. You can purchase a domain name directly through the Azure App Service portal, or you can bring your own.
Adding Site Extensions
It's possible to add extensions to your Web App. This can extend functionality, ease management overhead, and help in a number of different ways depending on your requirements. There are many extensions available; a few that extend functionality and improve monitoring capabilities are called out here:
Application Insights: provides monitoring capabilities
New Relic: provides monitoring capabilities; more details on the project page
PHP Manager: a tool for managing PHP installations. More details are available on the PHP Manager GitHub page
Jekyll: adds support for Jekyll on a Web App. More details are available on the Jekyll Site Extension GitHub page
App Service Deployment Options
There are a number of different options available for deployment.
Basic:
• FTP
• Web Deploy
Alternative:
• OneDrive/Dropbox
• Kudu
Source Control / Continuous Deployment:
• Visual Studio Online
• Local Git
• GitHub
• BitBucket
You can use various tools as part of these, such as PowerShell and the Azure CLI.
Data Services in Azure Overview
There are a number of different options to meet your needs for data services in Azure.
• SQL Databases: Based on SQL Server; provides a relational database service with scale, performance, and availability, as well as integration with existing on-premises SQL Server workloads for hybrid implementations.
• SQL Data Warehouse: A combination of the SQL Server relational database with Azure cloud scale-out capabilities. Suitable for enterprise, large-scale workloads.
• DocumentDB: A schema-free NoSQL database service, highly scalable and available.
• Table Storage: Stores structured NoSQL data with no schema. A lower-cost option than SQL. Could be suitable for user data for web apps, address books, device information, etc.
• Redis Cache: Provides access to a Redis cache, accessible by any application within Azure, providing high throughput and low latency for applications requiring speed and scale.
• Data Factory: Manages movement and integration of data. Assists in integrating different sources and different types of data.
• Data Lake: A collection of services that allows for the storage, management, and analysis of large amounts of data, getting the most out of the data you have.
Azure SQL Database (PaaS) vs. SQL Server in a VM (IaaS)
Service Tiers
There are three different service tiers to accommodate various workload requirements. All provide an uptime SLA of 99.99% and hourly billing.
The service tiers are:
• Basic: Suitable for small databases and low-volume needs
• Standard: Suitable for most cloud-based apps
• Premium: Suitable for high transactional volumes with mission-critical workloads
Within each of these top-level tiers, there are various performance levels available. It is possible to change service tiers and performance levels dynamically.
What is a DTU?
A Database Transaction Unit (DTU) is a measure of the resources that are guaranteed to be available to a standalone Azure SQL database at a specific performance level within a service tier.
• It is a measure that combines CPU, memory, and I/O values.
• The larger the number, the better the performance. This unit of measure gives you a way to see what your overall performance levels and needs are, and then to relate that to cost.
An elastic DTU (eDTU) is a measure of the resources shared across a set of databases, called an elastic pool.
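A rough back-of-the-envelope sketch of why pools matter: databases that peak at different times can share an eDTU pool sized well below the sum of their individual peaks. The numbers below are invented for illustration only, not real sizing or pricing guidance.

```python
# Hypothetical per-database peak DTU demand (illustrative values).
peaks = {"db1": 100, "db2": 100, "db3": 100}

# Standalone: each database must be provisioned for its own peak.
standalone_dtus = sum(peaks.values())

# Pooled: if peaks rarely overlap, a shared pool sized near the single
# largest peak (plus some headroom, here an assumed 50 eDTUs) can absorb them.
pool_edtus = max(peaks.values()) + 50

print(standalone_dtus, pool_edtus)  # 300 150
```

The trade-off: pooling saves capacity when workloads are bursty and uncorrelated, but if all databases peak together, the pool must be sized closer to the standalone total.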
Migrating a SQL Database
The general steps you should follow are:
• Test for compatibility: validate the database compatibility
• Fix compatibility issues if found
• Perform the migration
There are a number of options available to help with the migration process, depending on whether or not you can afford some downtime.
If you need minimal downtime, you can use SQL Server transactional replication to replicate your data.
If you can accept some downtime, there are several options, including:
• Use the built-in Deploy Database to Microsoft Azure Database wizard
• Export to a DAC package and import the DAC package into Azure SQL
• If you just want the schema, you can generate a script for the entire database schema using Transact-SQL
Method 1: Migration with downtime
1. Assess the database for any compatibility issues using the latest version of the Data Migration Assistant (DMA).
2. Prepare any necessary fixes as Transact-SQL scripts.
3. Make a transactionally consistent copy of the source database being migrated, and ensure no further changes are being made to the source database (or manually apply any such changes after the migration completes). There are many methods to quiesce a database, from disabling client connectivity to creating a database snapshot.
4. Deploy the Transact-SQL scripts to apply the fixes to the database copy.
5. Export the database copy to a .BACPAC file on a local drive.
6. Import the .BACPAC file as a new Azure SQL database using any of several BACPAC import tools, with SqlPackage.exe being the recommended tool for best performance.
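Step 6 can be sketched as a SqlPackage.exe invocation. The flags below follow SqlPackage's documented Import action; the file, server, database, and credential values are placeholders, not real ones:

```python
def sqlpackage_import_cmd(bacpac: str, server: str, database: str,
                          user: str, password: str) -> list:
    """Assemble SqlPackage.exe arguments for importing a .BACPAC
    into Azure SQL Database (step 6 above). All values are caller-supplied."""
    return [
        "SqlPackage.exe",
        "/Action:Import",
        f"/SourceFile:{bacpac}",
        f"/TargetServerName:{server}",
        f"/TargetDatabaseName:{database}",
        f"/TargetUser:{user}",
        f"/TargetPassword:{password}",
    ]

# Placeholder values for illustration:
cmd = sqlpackage_import_cmd("db_copy.bacpac", "myserver.database.windows.net",
                            "MigratedDb", "sql_admin", "<password>")
print(" ".join(cmd))
```

On Windows you would run the assembled command from a prompt (or via subprocess); run it close to the target region per the performance tips below.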
Optimizing data transfer performance during migration
The following list contains recommendations for best performance during the import process.
• Choose the highest service level and performance tier that your budget allows to maximize the transfer performance. You can scale down after the migration completes to save money.
• Minimize the distance between your .BACPAC file and the destination data center.
• Disable auto-statistics during migration.
• Partition tables and indexes.
• Drop indexed views, and recreate them once finished.
• Remove rarely queried historical data to another database and migrate this historical data to a separate Azure SQL database. You can then query this historical data using elastic queries.
Method 2: Use Transactional Replication
1. Set up distribution — using SQL Server Management Studio (SSMS) or Transact-SQL
2. Create publication — using SQL Server Management Studio (SSMS) or Transact-SQL
3. Create subscription — using SQL Server Management Studio (SSMS) or Transact-SQL
Data Lake
What is Data Lake?
Data Lake is a tool for batch, real-time, and interactive data analysis. Data Lake makes it easy for developers, data scientists, and analysts to store data of any size, shape, and speed, and to do all types of processing and analytics across platforms and languages.
Azure Data Lake is a family of Azure services that enables you to analyze your big data workloads in a managed manner.
It consists of these services:
• Azure Data Lake Store - A data repository that enables you to store any type of data in its raw format without defining a schema. The store offers unlimited storage with immediate read/write access, scaling to the throughput you need for your workloads. The store is Hadoop Distributed File System (HDFS) compatible, so you can use your existing tools.
• Azure Data Lake Analytics - An analytics service that allows you to run analysis jobs on data. Analytics uses Apache YARN to manage the resources for its processing engine. Using the U-SQL query language, you can process data from several data sources such as Azure Data Lake Store, Azure Blob Storage, and Azure SQL Database, but also from other data stores built on HDFS.
• Azure HDInsight - An analytics service that enables you to analyze data sets on a managed cluster running open-source technologies such as Hadoop, Spark, Storm & HBase.
Task
Write your own Azure ARM Templates
1. Create a new ARM template
2. Deploy a full PaaS solution
3. Populate the website from a GitHub source
https://github.com/TheAzureGuy007/Custom-ARM-Templates
Azure Resources we will use
http://rickrainey.com/2016/03/21/deploy-an-azure-resource-manage-template/
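As a starting point for the task above, here is a minimal ARM template sketch that deploys a PaaS website (an App Service plan plus a Web App). The names, SKU, and api-versions are illustrative placeholders, not the contents of the Custom-ARM-Templates repo:

```json
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "siteName": { "type": "string" }
  },
  "variables": {
    "planName": "[concat(parameters('siteName'), '-plan')]"
  },
  "resources": [
    {
      "type": "Microsoft.Web/serverfarms",
      "apiVersion": "2016-09-01",
      "name": "[variables('planName')]",
      "location": "[resourceGroup().location]",
      "sku": { "name": "F1" }
    },
    {
      "type": "Microsoft.Web/sites",
      "apiVersion": "2016-08-01",
      "name": "[parameters('siteName')]",
      "location": "[resourceGroup().location]",
      "dependsOn": [
        "[resourceId('Microsoft.Web/serverfarms', variables('planName'))]"
      ],
      "properties": {
        "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', variables('planName'))]"
      }
    }
  ]
}
```

Deploying it through the portal, PowerShell, or the CLI creates the plan first (via dependsOn) and then the site; source-control integration for step 3 can be layered on as an additional resource or configured afterwards.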
Docker & Containers
Hypervisor ~ Containers
VMware ~ Docker
Running Docker Machine/Client Commands on Mac OS X

Running Docker Machine Commands in the Quickstart Terminal
Girishs-Mac:~ girishkalamati$ bash --login '/Applications/Docker/Docker Quickstart Terminal.app/Contents/Resources/Scripts/start.sh'
Last login: Fri Sep 29 22:12:55 on ttys000
(Docker whale ASCII banner)
docker is configured to use the default machine with IP 192.168.99.100
For help getting started, check out the docs at https://docs.docker.com
Girishs-Mac:~ girishkalamati$ docker-machine ls
Running Docker Machine Commands in Normal Bash
Girishs-Mac:~ girishkalamati$ docker ps
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
Girishs-Mac:~ girishkalamati$ docker-machine env default
export DOCKER_TLS_VERIFY="1"
export DOCKER_HOST="tcp://192.168.99.100:2376"
export DOCKER_CERT_PATH="/Users/girishkalamati/.docker/machine/machines/default"
export DOCKER_MACHINE_NAME="default"
# Run this command to configure your shell:
# eval $(docker-machine env default)
Girishs-Mac:~ girishkalamati$ eval $(docker-machine env default)
Girishs-Mac:~ girishkalamati$ docker ps
CONTAINER ID   IMAGE   COMMAND   CREATED   STATUS   PORTS   NAMES
Girishs-Mac:~ girishkalamati$
Running Docker Client Commands
Girishs-Mac:~ girishkalamati$ docker pull hello-world
Using default tag: latest
latest: Pulling from library/hello-world
Digest: sha256:b2ba691d8aac9e5ac3644c0788e3d3823f9e97f757f01d2ddc6eb5458df9d801
Status: Image is up to date for hello-world:latest
Girishs-Mac:~ girishkalamati$ docker images
REPOSITORY    TAG      IMAGE ID       CREATED       SIZE
hello-world   latest   05a3bd381fc2   2 weeks ago   1.84kB
Girishs-Mac:~ girishkalamati$ docker run hello-world
Hello from Docker!
This message shows that your installation appears to be working correctly.
Running Docker Client Commands
docker is configured to use the default machine with IP 192.168.99.100
For help getting started, check out the docs at https://docs.docker.com
Girishs-Mac:~ girishkalamati$ docker ps -a
CONTAINER ID   IMAGE   COMMAND   CREATED   STATUS   PORTS   NAMES
Girishs-Mac:~ girishkalamati$ docker images
REPOSITORY    TAG      IMAGE ID       CREATED       SIZE
hello-world   latest   05a3bd381fc2   2 weeks ago   1.84kB
Girishs-Mac:~ girishkalamati$ docker rmi 05a3
Untagged: hello-world:latest
Untagged: hello-world@sha256:b2ba691d8aac9e5ac3644c0788e3d3823f9e97f757f01d2ddc6eb5458df9d801
Deleted: sha256:05a3bd381fc2470695a35f230afefd7bf978b566253199c4ae5cc96fafa29b37
Deleted:
Girishs-Mac:~ girishkalamati$ docker ps
Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
Girishs-Mac:~ girishkalamati$ docker-machine env default
export DOCKER_TLS_VERIFY="1"
export DOCKER_HOST="tcp://192.168.99.100:2376"
export DOCKER_CERT_PATH="/Users/girishkalamati/.docker/machine/machines/default"
export DOCKER_MACHINE_NAME="default"
# Run this command to configure your shell:
# eval $(docker-machine env default)
Girishs-Mac:~ girishkalamati$ eval $(docker-machine env default)
Girishs-Mac:~ girishkalamati$ docker ps
CONTAINER ID   IMAGE                         COMMAND          CREATED          STATUS          PORTS                NAMES
0f8a3ec0d161   kitematic/hello-world-nginx   "sh /start.sh"   16 minutes ago   Up 16 minutes   0.0.0.0:80->80/tcp   trusting_swanson
Girishs-Mac:~ girishkalamati$ docker-machine ls
NAME      ACTIVE   DRIVER       STATE     URL                         SWARM   DOCKER        ERRORS
default   *        virtualbox   Running   tcp://192.168.99.100:2376           v17.09.0-ce
Girishs-Mac:~ girishkalamati$ dsenableroot
username = girishkalamati
user password:
root password:
verify root password:
dsenableroot:: ***Successfully enabled root user.
Girishs-Mac:~ girishkalamati$ npm install express express-generator -g
/usr/local/lib
└── express@4.16.1
npm ERR! Darwin 15.6.0
npm ERR! argv "/usr/local/bin/node" "/usr/local/bin/npm" "install" "express" "express-generator" "-g"
npm ERR! node v6.11.3
npm ERR! npm v3.10.10
npm ERR! path ../lib/node_modules/express-generator/bin/express-cli.js
Running Docker Machine/Client Commands on Windows
Running Docker Client Commands
PS C:\Users\Girish> docker ps
error during connect: Get http://%2F%2F.%2Fpipe%2Fdocker_engine/v1.31/containers/json: open //./pipe/docker_engine: The system cannot find the file specified. In the default daemon configuration on Windows, the docker client must be run elevated to connect. This error may also indicate that the docker daemon is not running.
PS C:\Users\Girish> docker-machine env default
$Env:DOCKER_TLS_VERIFY = "1"
$Env:DOCKER_HOST = "tcp://192.168.99.100:2376"
$Env:DOCKER_CERT_PATH = "C:\Users\Girish\.docker\machine\machines\default"
$Env:DOCKER_MACHINE_NAME = "default"
$Env:COMPOSE_CONVERT_WINDOWS_PATHS = "true"
# Run this command to configure your shell:
# & "C:\Program Files\Docker Toolbox\docker-machine.exe" env default | Invoke-Expression
PS C:\Users\Girish> & "C:\Program Files\Docker Toolbox\docker-machine.exe" env default | Invoke-Expression
PS C:\Users\Girish> docker ps
CONTAINER ID   IMAGE   COMMAND   CREATED   STATUS   PORTS
PS C:\Users\Girish> docker pull kitematic/hello-world-nginx
Using default tag: latest
latest: Pulling from kitematic/hello-world-nginx
77c6c00e8b61: Pull complete
9b55a9cb10b3: Pull complete
e6cdd97ba74d: Pull complete
7fecf1e9de6b: Pull complete
6b75f22d7bea: Pull complete
e8e00fb8479f: Pull complete
69fad424364c: Pull complete
b3ba6e76b671: Pull complete
a956773dd508: Pull complete
26d2b0603932: Pull complete
3cdbb221209e: Pull complete
a3ed95caeb02: Pull complete
Digest: sha256:ec0ca6dcb034916784c988b4f2432716e2e92b995ac606e080c7a54b52b87066
Status: Downloaded newer image for kitematic/hello-world-nginx:latest
Containers
The old days, when we used to run apps on servers
The VMware revolution
Even VMware had loopholes
Availability Set
Docker
https://github.com/docker
Docker Hub or Docker Store
https://store.docker.com/
Searching for a MongoDB container in Docker Hub or the Docker Store
Container Registration
Publishing back the customized container
How can Docker be useful for us?
Full CI/CD pipeline to deploy a multi-container application on Azure Container Service with Docker Swarm using Visual Studio Team Services
First, create a VSTS account if you do not have one
Authorize via the UI or a PAT token
Add a build step in the build workflow
You need to add two Docker steps for each image: one to build the image, and one to push the image to the Azure Container Registry
Bamboo (Continuous Delivery Tool)
Build: Focus on coding and count on Bamboo as your CI and build server! Create multi-stage build plans, set up triggers to start builds upon commits, and assign agents to your critical builds and deployments.
Test: Testing is a key part of continuous integration. Run automated tests in Bamboo to regress your products thoroughly with each change. Parallel automated tests unleash the power of Agile Development and make catching bugs easier and faster.
Deploy: Deployment projects automate the tedium right out of releasing into each environment, while letting you control the flow with per-environment permissions.
Connect: Bamboo boasts the best integration with JIRA Software, Bitbucket, Fisheye, and HipChat. Also, boost your CI pipeline by choosing from more than a hundred and fifty add-ons in our Marketplace, or make your own.
Bamboo Server vs. Jenkins
JIRA (Development Tool)
Plan: Create user stories and issues, plan sprints, and distribute tasks across your software team.
Track: Prioritize and discuss your team's work in full context with complete visibility.
Release: Ship with confidence and sanity knowing the information you have is always current.
Report: Improve team performance based on real-time, visual data you can use.
Azure from scratch part 3 By Girish Kalamati

Azure from scratch part 3 By Girish Kalamati

  • 1.
  • 2.
    Intro & Settingup cloud mind-Set girishkrao.portfoliobox.net theazureguys.wordpress.com twitter.com/TheAzureGuy007 facebook.com/TheAzureGuy007 https://github.com/TheAzureGuy007 https://www.linkedin.com/in/girish-kalamati-357a6398/ https://www.youtube.com/channel/UCd9z6-2mZdqjRnAHh3W_9Uw
  • 3.
    Azure Functions(Azure) orLamda Functions (AWS) Traditional Technical Architecture Serverless Architecture
  • 10.
  • 18.
    Think of aReal World Scenario where we can post “Build Pipeline event” info to “SLACK”
  • 19.
  • 22.
    Azure Functions :Github Webhook Trrigers functions whenever comment is issued Azure Functions : Store unstructured data using Azure Functions and Cosmos DBDemo
  • 23.
    Get function URL:https://myfunctionappp.azurewebsites.net/api/Github-Webhook-JS?clientId=default GitHub Secret : M4ZU1EmyWrZ/jimZkg*****************************ZJR9jg== Azure Functions : Webhook + GITHub https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-github-webhook-triggered-function Learn how to create a function that is triggered by an HTTP webhook request with a GitHub-specific payload
  • 24.
    Add the detailsin your own GitHub Repo : https://github.com/TheAzureGuy007/ The-Azure-Guy-Repo/ settings/hooks/new
  • 25.
    Azure Functions :Webhook + GITHub
  • 26.
    Azure Functions :Webhook + GITHub
  • 27.
    Azure Cosmos DB Globallydistributed mission-critical applications Guarantee access to users around the world with the high-availability and low-latency capabilities built into Microsoft’s global datacenters.
  • 28.
    Azure Cosmos DB IoT Instantly,elastically scale to accommodate diverse and unpredictable IoT workloads without sacrificing ingestion or query performance.
  • 29.
    Store unstructured datausing Azure Functions and Cosmos DB
  • 30.
  • 31.
    Check the datawould have pushed
  • 32.
    Check the datawould have pushed
  • 33.
    Check the datawould have pushed
  • 34.
    App Service PlanDefinition
  • 36.
    Azure PaaS andAzure Stack
  • 37.
    Application Platform Offerings Therethree different methodologies you have available for running your applications on the Microsoft platform, and they each give different levels of control and autonomy, yet still provide the flexibility, availability and cost savings that are associated with PaaS workloads. •Azure App Service: allows you leverage the benefits of a PaaS solution •App Service Environment: allows you to leverage Azure App Services in a more isolated and salable environment •Azure Stack: allows you have Azure in your data center and use the same controls and architecture as Azure while still maintaining your on-premises control. It can help you prepare move more seamlessly into Azure if you wish to do so as they are the same environment, one in your data center and one in the cloud.
  • 38.
    Managing Azure AppsServices Management Tools There are a variety of tools you can use to interact and manage Azure App Services, Below are the main ones that are typically available throughout Azure that are also applicable with Azure App Services •Azure PowerShell: set of modules providing PowerShell cmdlets. Can be run on Windows, Linux or MacOS •Azure Command Line Interface (CLI):open source command line shell. Can be run on Windows, Linux of MacOS •REST APIs: REST based APIs for Azure resource manager (ARM) •Templates: Resource Manager templates, to define resource objects and automate deployment and configuration •Azure Portal: • New Portal: portal.azure.com • Classic Portal: manage.windowsazure.com
  • 39.
  • 40.
    Managing Azure AppsServices U can run Curl (File transfer cmd’s) or Git commdands
  • 41.
    Locking Resources Creating aLock In Azure App Services It is possible lock a subscription, resource group, or resource such as your web application, to prevent other users from deleting or modifying it. You can set the lock level to •CanNotDelete: means authorized users can still read and modify a resource, but they can't delete it. •ReadOnly: means authorized users can read from a resource, but they can't delete it or perform any actions on it. The permission on the resource is restricted to the Reader role. Locks differ from Role Based Access control in that locks apply to all users and roles.
  • 42.
    Configure Custom DomainName It is possible to create your own custom domain name and use it with your Web App hosted on Azure App Services. You can purchase a domain name directly through the Azure App services portal or you can bring your own
  • 43.
    Adding Site Extensions Itspossible to add extensions to your Web App. This can extend functionality and ease management overhead and help in a number of different ways depending on your own requirements. There are a lot of extensions available, calling out just a few here which help extend functionality and improve monitoring capabilities Application insights: provides monitoring capabilities New Relic: provides monitoring capabilities, more details on the project page Php Manager: tool for managing php installations. More details are available on the PHP Manager Gitub page Jekyll: Adds support for Jekyll on a Web App. More details are available on the Jekyll Site Extension Gitub page
  • 44.
    App Service DeploymentOptions App Service Deployment Options There are a number of different options available for deployment Basic: • FTP • Web Deploy Alternative: • OneDrive/DropBox • Kudu Source Control / Continuous Deployment • Visual Studio Online • Local Git • GitHub • BitBucket You can use various tools as part of these such as powerShell, Azure CLI
  • 45.
    Data Services inAzure Overview There are a number of different options to meet your needs for Data services in Azure. • SQL Databases: Based on SQL sever, provides relational database server with scale, performance and availability, as well as integration with existing on-premises SQL Server workloads for hybrid implementations. • SQL data Warehouse: A combination of SQL server relational db with Azure cloud scale-out capabilities. Suitable for enterprise, large scale workloads. • Document DB: A schema free NoSQL database service, highly scalable and available. • Table Storage: Stores structured NoSQL data with no schema. lower cost option than SQL. Could be suitable for user data for web apps, address books, device information etc. • Redis Cache: Provides access to a Redis cache, accessible by any application within Azure, providing high throughput and low-latency for application requiring speed and scale. • Data Factory: Manages movement and integration of data. Assists integrating different sources and different types of data. • Data Lake: A collection of services that allows for the storing, managing and analysis of large amounts of data, getting the most out of the data you have
  • 47.
    Azure SQL (PaaS)Vs SQL Server (IaaS)
  • 48.
    Azure SQL Database(PaaS) Vs SQL Server in a VM (IaaS)
  • 50.
    Service Tiers There arethree different Service Tiers to accommodate various workload requirements. All provide an up-time SLA of 99.99% and hourly billing. The service tiers are • Basic: Suitable for small databases, and low volume needs • Standard: Suitable for most cloud based apps • Premium: Suitable for high transnational volumes with super critical workloads Within each of these top level tiers, there are various performance levels available. It is possible to change service tiers and performance levels dynamically
  • 51.
    What is aDTU? A Data Transnational Unit (DTU) is a measure of the resources that are guaranteed to be available to a standalone • Azure SQL database at a specific performance level within a service Tier. • It is a measure that combines CPU, memory and I/O values. • The larger the number the better the performance, but this unit of measure provides a way for you to see what your overall performance levels are and what your needs are, then being able to relate that to cost. An elastic DTU (eDTU) is a measure of the resources across a set of databases, called an elastic pool.
  • 54.
    The general stepsyou should follow are • Test for compatibility: validate the database compatibility • Fix Compatibility issues if found • Perform the migration There are a number of options available to help with the process of migration depending on whether you can afford some down time or not. If you need minimal down time you can use SQL Server transactional replication replicate your data. If you can accept some down time, some of which are • Use the built in Deploy Database to Microsoft Azure Database Wizard • Export to DAC package and ImportDAC package in Azure SQL • If you just want the schema you can generate a script for entire database schema using Transact SQL Migrating a SQL database
  • 55.
    Method 1: Migrationwith downtime 1.Assess the database for any compatibility issues using the latest version of Data Migration Assistant (DMA). 2.Prepare any necessary fixes as Transact-SQL scripts. 3.Make a transactionally consistent copy of the source database being migrated - and ensure no further changes are being made to the source database (or you can manually apply any such changes after the migration completes). There are many methods to quiesce a database, from disabling client connectivity to creating a database snapshot. 4.Deploy the Transact-SQL scripts to apply the fixes to the database copy. 5.Export the database copy to a .BACPAC file on a local drive. 6.Import the .BACPAC file as a new Azure SQL database using any of several BACPAC import tools, with SQLPackage.exe being the recommended tool for best performance.
  • 56.
    The following listcontains recommendations for best performance during the import process. • Choose the highest service level and performance tier that your budget allows to maximize the transfer performance. You can scale down after the migration completes to save money. • Minimize the distance between your .BACPAC file and the destination data center. • Disable auto-statistics during migration • Partition tables and indexes • Drop indexed views, and recreate them once finished • Remove rarely queried historical data to another database and migrate this historical data to a separate Azure SQL database. You can then query this historical data using elastic queries. Optimizing data transfer performance during migration
  • 57.
    Method 2: UseTransactional Replication 1.Set up Distribution 1. Using SQL Server Management Studio (SSMS) 2. Using Transact-SQL 2.Create Publication 1. Using SQL Server Management Studio (SSMS) 2. Using Transact-SQL 3.Create Subscription 1. Using SQL Server Management Studio (SSMS) 2. Using Transact-SQL
  • 61.
    Data Lake What isData Lake? Data Lake is a batch, real-time, interactive data analysis tool. Data Lake makes it easy for developers, data scientists, and analysts to store data of any size, shape and speed, and do all types of processing and analytics across platforms and languages. Azure Data Lake is a family of Azure services that enables you to analyze your big data workloads in a managed manner. It consists of these services: • Azure Data Lake Store - A data repository that enables you to store any type of data in its raw format without defining schema. The store offers unlimited storage with immediate read/write access to it and scaling the throughput you need for your workloads. The store is Hadoop Data File System (HDFS) compatible so you can use your existing tools. • Azure Data Lake Analytics - An analytics service that allows you to run analysis jobs on data. Analytics using Apache YARN to manage its resources for the processing engine. By using the U-SQL query language you can process data from several data sources such as Azure Data Lake Store, Azure Blob Storage, Azure SQL Database but also from other data stores built on HDFS. • Azure Data Lake HDInsight - An analytics service that enables you to analyze data sets on a managed cluster running open-source technologies such as Hadoop, Spark, Storm & HBase.
  • 62.
  • 63.
    Write your own Azure ARM Templates
    1. Create a new ARM template
    2. Deploy a full PaaS solution
    3. Populate the website from a GitHub source
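The deploy step can be sketched with the Azure CLI of that era. The resource group, location, and template file names are placeholders, and the commands are echoed (dry run) rather than executed:

```shell
# Dry-run sketch: deploy an ARM template that stands up a PaaS web app.
# Resource group, location, and template path are placeholders.
RG="demo-rg"
LOCATION="westeurope"
TEMPLATE="azuredeploy.json"

# Create the resource group, then deploy the template into it
echo az group create --name "$RG" --location "$LOCATION"
echo az group deployment create --resource-group "$RG" --template-file "$TEMPLATE"
```

The template itself would carry the App Service plan, the web app, and the GitHub source-control settings as resources.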
  • 64.
  • 65.
  • 66.
    Dockers & Containers: Hypervisor ~ Container, VMware ~ Docker
  • 90.
    Running Docker Machine/Client Commands in Mac OS X
  • 91.
    Running Docker Machine Commands in QuickStart Terminal
    Last login: Fri Sep 29 22:12:55 on ttys000
    Girishs-Mac:~ girishkalamati$ bash --login '/Applications/Docker/Docker Quickstart Terminal.app/Contents/Resources/Scripts/start.sh'
    (Docker whale ASCII art)
    docker is configured to use the default machine with IP 192.168.99.100
    For help getting started, check out the docs at https://docs.docker.com
    Girishs-Mac:~ girishkalamati$ docker-machine ls
  • 92.
    Running Docker Machine Commands in Normal Bash
    Girishs-Mac:~ girishkalamati$ docker ps
    Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
    Girishs-Mac:~ girishkalamati$ docker-machine env default
    export DOCKER_TLS_VERIFY="1"
    export DOCKER_HOST="tcp://192.168.99.100:2376"
    export DOCKER_CERT_PATH="/Users/girishkalamati/.docker/machine/machines/default"
    export DOCKER_MACHINE_NAME="default"
    # Run this command to configure your shell:
    # eval $(docker-machine env default)
    Girishs-Mac:~ girishkalamati$ eval $(docker-machine env default)
    Girishs-Mac:~ girishkalamati$ docker ps
    CONTAINER ID  IMAGE  COMMAND  CREATED  STATUS  PORTS  NAMES
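What `eval $(docker-machine env default)` actually does is run the printed `export` lines in the current shell. The snippet below replays that by hand (the IP and paths are copied from the transcript; your machine's values will differ):

```shell
# Replaying what `eval $(docker-machine env default)` performs:
# docker-machine prints export statements, and eval executes them in the
# current shell so the docker client can find the daemon inside the VM.
export DOCKER_TLS_VERIFY="1"
export DOCKER_HOST="tcp://192.168.99.100:2376"   # the VM's IP from the transcript
export DOCKER_CERT_PATH="$HOME/.docker/machine/machines/default"
export DOCKER_MACHINE_NAME="default"

# From here on, `docker ps` talks to the daemon at $DOCKER_HOST:
echo "docker client will connect to: $DOCKER_HOST"
```

This also explains the earlier "Cannot connect to the Docker daemon" error: before the eval, the client looks for a local socket at /var/run/docker.sock that does not exist on the Mac host.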
  • 95.
    Running Docker Client Commands
    Girishs-Mac:~ girishkalamati$ docker pull hello-world
    Using default tag: latest
    latest: Pulling from library/hello-world
    Digest: sha256:b2ba691d8aac9e5ac3644c0788e3d3823f9e97f757f01d2ddc6eb5458df9d801
    Status: Image is up to date for hello-world:latest
    Girishs-Mac:~ girishkalamati$ docker images
    REPOSITORY    TAG     IMAGE ID      CREATED      SIZE
    hello-world   latest  05a3bd381fc2  2 weeks ago  1.84kB
    Girishs-Mac:~ girishkalamati$ docker run hello-world
    Hello from Docker!
    This message shows that your installation appears to be working correctly.
  • 98.
    docker is configured to use the default machine with IP 192.168.99.100
    For help getting started, check out the docs at https://docs.docker.com
    Girishs-Mac:~ girishkalamati$ docker ps -a
    CONTAINER ID  IMAGE  COMMAND  CREATED  STATUS  PORTS  NAMES
    Girishs-Mac:~ girishkalamati$ docker images
    REPOSITORY    TAG     IMAGE ID      CREATED      SIZE
    hello-world   latest  05a3bd381fc2  2 weeks ago  1.84kB
    Girishs-Mac:~ girishkalamati$ docker rmi 05a3
    Untagged: hello-world:latest
    Untagged: hello-world@sha256:b2ba691d8aac9e5ac3644c0788e3d3823f9e97f757f01d2ddc6eb5458df9d801
    Deleted: sha256:05a3bd381fc2470695a35f230afefd7bf978b566253199c4ae5cc96fafa29b37
    Deleted:
  • 100.
    Running Docker Client Commands
    Girishs-Mac:~ girishkalamati$ docker ps
    Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
    Girishs-Mac:~ girishkalamati$ docker-machine env default
    export DOCKER_TLS_VERIFY="1"
    export DOCKER_HOST="tcp://192.168.99.100:2376"
    export DOCKER_CERT_PATH="/Users/girishkalamati/.docker/machine/machines/default"
    export DOCKER_MACHINE_NAME="default"
    # Run this command to configure your shell:
    # eval $(docker-machine env default)
    Girishs-Mac:~ girishkalamati$ eval $(docker-machine env default)
    Girishs-Mac:~ girishkalamati$ docker ps
    CONTAINER ID  IMAGE                        COMMAND         CREATED         STATUS         PORTS               NAMES
    0f8a3ec0d161  kitematic/hello-world-nginx  "sh /start.sh"  16 minutes ago  Up 16 minutes  0.0.0.0:80->80/tcp  trusting_swanson
  • 116.
    Girishs-Mac:~ girishkalamati$ docker-machine ls
    NAME     ACTIVE  DRIVER      STATE    URL                        SWARM  DOCKER       ERRORS
    default  *       virtualbox  Running  tcp://192.168.99.100:2376         v17.09.0-ce
    Girishs-Mac:~ girishkalamati$ dsenableroot
    username = girishkalamati
    user password:
    root password:
    verify root password:
    dsenableroot:: ***Successfully enabled root user.
    Girishs-Mac:~ girishkalamati$ npm install express express-generator -g
    /usr/local/lib
    └── express@4.16.1
    npm ERR! Darwin 15.6.0
    npm ERR! argv "/usr/local/bin/node" "/usr/local/bin/npm" "install" "express" "express-generator" "-g"
    npm ERR! node v6.11.3
    npm ERR! npm v3.10.10
    npm ERR! path ../lib/node_modules/express-generator/bin/express-cli.js
  • 117.
    Running Docker Machine/Client Commands in Windows
  • 120.
    Running Docker Client Commands
    Girishs-Mac:~ girishkalamati$ docker ps
    Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
    Girishs-Mac:~ girishkalamati$ docker-machine env default
    export DOCKER_TLS_VERIFY="1"
    export DOCKER_HOST="tcp://192.168.99.100:2376"
    export DOCKER_CERT_PATH="/Users/girishkalamati/.docker/machine/machines/default"
    export DOCKER_MACHINE_NAME="default"
    # Run this command to configure your shell:
    # eval $(docker-machine env default)
    Girishs-Mac:~ girishkalamati$ eval $(docker-machine env default)
    Girishs-Mac:~ girishkalamati$ docker ps
    CONTAINER ID  IMAGE                        COMMAND         CREATED         STATUS         PORTS               NAMES
    0f8a3ec0d161  kitematic/hello-world-nginx  "sh /start.sh"  16 minutes ago  Up 16 minutes  0.0.0.0:80->80/tcp  trusting_swanson
  • 121.
    PS C:\Users\Girish> docker ps
    error during connect: Get http://%2F%2F.%2Fpipe%2Fdocker_engine/v1.31/containers/json: open //./pipe/docker_engine: The system cannot find the file specified. In the default daemon configuration on Windows, the docker client must be run elevated to connect. This error may also indicate that the docker daemon is not running.
    PS C:\Users\Girish> docker-machine env default
    $Env:DOCKER_TLS_VERIFY = "1"
    $Env:DOCKER_HOST = "tcp://192.168.99.100:2376"
    $Env:DOCKER_CERT_PATH = "C:\Users\Girish\.docker\machine\machines\default"
    $Env:DOCKER_MACHINE_NAME = "default"
    $Env:COMPOSE_CONVERT_WINDOWS_PATHS = "true"
    # Run this command to configure your shell:
    # & "C:\Program Files\Docker Toolbox\docker-machine.exe" env default | Invoke-Expression
    PS C:\Users\Girish> & "C:\Program Files\Docker Toolbox\docker-machine.exe" env default | Invoke-Expression
    PS C:\Users\Girish> docker ps
    CONTAINER ID  IMAGE  COMMAND  CREATED  STATUS  PORTS
  • 122.
    PS C:\Users\Girish> docker pull kitematic/hello-world-nginx
    Using default tag: latest
    latest: Pulling from kitematic/hello-world-nginx
    77c6c00e8b61: Pull complete
    9b55a9cb10b3: Pull complete
    e6cdd97ba74d: Pull complete
    7fecf1e9de6b: Pull complete
    6b75f22d7bea: Pull complete
    e8e00fb8479f: Pull complete
    69fad424364c: Pull complete
    b3ba6e76b671: Pull complete
    a956773dd508: Pull complete
    26d2b0603932: Pull complete
    3cdbb221209e: Pull complete
    a3ed95caeb02: Pull complete
    Digest: sha256:ec0ca6dcb034916784c988b4f2432716e2e92b995ac606e080c7a54b52b87066
    Status: Downloaded newer image for kitematic/hello-world-nginx:latest
  • 124.
  • 125.
    The old days, when we used to run apps on servers
  • 126.
  • 127.
    Even VMware had loopholes
  • 128.
  • 129.
  • 130.
  • 131.
    Docker Hub or Docker Store https://store.docker.com/
  • 132.
    Searching for a MongoDB container in Docker Hub or Store
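The same search can be done from the command line instead of the Hub web UI. A dry-run sketch (commands echoed, not executed) for finding and running the official MongoDB image:

```shell
# Dry-run sketch: the CLI equivalent of searching Docker Hub for MongoDB
# and pulling the official image.
MONGO_IMAGE="mongo:latest"

echo docker search mongo                               # list Hub images matching "mongo"
echo docker pull "$MONGO_IMAGE"                        # fetch the official MongoDB image
echo docker run -d -p 27017:27017 --name mongodb mongo # run it on the default port
```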
  • 133.
  • 134.
  • 135.
    Publishing back the customized container
  • 136.
    Publishing back the customized container
  • 137.
    Publishing back the customized container
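Publishing a customized container back to Docker Hub boils down to commit, login, and push. A dry-run sketch (commands echoed, not executed) using a placeholder Hub account and the container ID seen in the earlier `docker ps` output:

```shell
# Dry-run sketch: capture a modified container as an image and publish it.
# "myhubuser" is a placeholder Docker Hub account; the container ID is
# taken from the earlier `docker ps` transcript.
CONTAINER="0f8a3ec0d161"
IMAGE="myhubuser/hello-world-nginx:custom"

echo docker commit "$CONTAINER" "$IMAGE"   # snapshot the container's filesystem as an image
echo docker login                          # authenticate to Docker Hub
echo docker push "$IMAGE"                  # upload the image to your Hub repository
```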
  • 138.
    How can Docker be useful for us?
  • 139.
    Full CI/CD pipeline to deploy a multi-container application on Azure Container Service with Docker Swarm, using Visual Studio Team Services
  • 155.
    First, create a VSTS account in case you do not have one
  • 156.
    Authorize via UI or PAT token
  • 157.
    Add build steps to the build workflow. You need two Docker steps for each image: one to build the image, and one to push the image to the Azure Container Registry.
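The two Docker steps per image correspond to this CLI pair. Registry and image names are placeholders, and the commands are echoed (dry run) rather than executed:

```shell
# Dry-run sketch: the build-and-push pair that VSTS runs for each image.
# Registry and image names are placeholders.
ACR="myregistry.azurecr.io"
IMAGE="$ACR/myapp/web:latest"

echo docker build -t "$IMAGE" .   # step 1: build the image from the repo's Dockerfile
echo docker push "$IMAGE"         # step 2: push it to the Azure Container Registry
```

Tagging the image with the registry's login server name is what routes the push to ACR instead of Docker Hub.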
  • 158.
    Bamboo (Continuous Delivery Tool)
    Build: Focus on coding and count on Bamboo as your CI and build server! Create multi-stage build plans, set up triggers to start builds upon commits, and assign agents to your critical builds and deployments.
    Test: Testing is a key part of continuous integration. Run automated tests in Bamboo to regress your products thoroughly with each change. Parallel automated tests unleash the power of agile development and make catching bugs easier and faster.
  • 159.
    Bamboo (Continuous Delivery Tool)
    Deploy: Deployment projects automate the tedium right out of releasing into each environment, while letting you control the flow with per-environment permissions.
    Connect: Bamboo boasts the best integration with JIRA Software, Bitbucket, Fisheye, and HipChat. Also, boost your CI pipeline by choosing from more than one hundred fifty add-ons in the Marketplace, or make your own.
  • 160.
  • 161.
    JIRA (Development Tool)
    Plan: Create user stories and issues, plan sprints, and distribute tasks across your software team.
    Track: Prioritize and discuss your team's work in full context with complete visibility.
  • 162.
    JIRA (Development Tool)
    Release: Ship with confidence and sanity, knowing the information you have is always current.
    Report: Improve team performance based on real-time, visual data you can use.

Editor's Notes

  • #92 Running Docker Codes in QuickStart Terminal
  • #93 Running Docker Codes in Bash Terminal
  • #96 Running Docker Codes in Bash Terminal
  • #109 docker run -p 8080:3000 -v /var/www node — here node is the image name. It will create a container inside the host … even though it says container volume
  • #111 Custom folder to put source code in HOST