In this presentation, we will cover advanced Terraform topics (full-on DevOps). We will compare the deployment of Terraform using Azure DevOps, GitHub/GitHub Actions, and Terraform Cloud. We wrap everything up with some key takeaway resources for your Terraform learning adventure.
NOTE: A recording of this presentation is available here: https://www.youtube.com/watch?v=fJ8_ZbOIdto&t=5574s
An introduction to Terraform, a tool that helps you deploy and change your infrastructure as code. Given at Rencontres Mondiales du Logiciel Libre (RMLL) 2017.
Terraform modules and best-practices - September 2018, by Anton Babenko
Slides for my "Terraform modules and best-practices" talk on meetups during September 2018.
Some links from the slides:
https://www.terraform-best-practices.com/
https://cloudcraft.co/
https://github.com/terraform-aws-modules/
https://github.com/antonbabenko/modules.tf-lambda
This beginning Terraform workshop will teach you how to safely create and provision Infrastructure as Code (IaC) using HashiCorp Terraform in an AWS environment. In this class you will learn how to set up and install Terraform, and you will be given a walkthrough of Terraform fundamentals. You will be led through the process of deploying a single server, deploying a cluster, and setting up a load balancer. You will also learn how to author Terraform modules, work with Route 53, and manage DNS.
Requirements: You will need to have an AWS account already set up, with Terraform v0.9.3 installed. You will also need git installed to download the workshop material.
You can find more information on how to install Terraform here: https://www.terraform.io/intro/getting-started/install.html. You can sign up for an AWS account here: https://aws.amazon.com/account/
https://github.com/jasonvance/terraform-introduction
As part of this presentation we covered the basics of Terraform, which is infrastructure as code. It will help DevOps teams get started with Terraform.
This document will be helpful for developers who want to understand infrastructure-as-code concepts and the usability of Terraform.
Infrastructure-as-Code (IaC) using Terraform, by Adin Ermie
Learn the benefits of Infrastructure-as-Code (IaC), what Terraform is and why people love it, along with a breakdown of the basics (including live demo deployments). Then wrap up with a comparison of Azure Resource Manager (ARM) templates versus Terraform, consider some best practices, and walk away with some key resources in your Terraform learning adventure.
Are you looking to automate your infrastructure but not sure where to start? View this presentation on "Getting started with Infrastructure as Code" to learn how to leverage IaC to deploy and manage resources on Azure. You will learn:
• Introduction to IaC
• Developing a simple IaC example using Terraform
• Managing the deployed infrastructure using Terraform
View webinar recording at https://www.winwire.com/webinars
Best Practices of Infrastructure as Code with Terraform, by DevOps.com
When your organization is moving to the cloud, the infrastructure layer transitions from running dedicated servers at limited scale to a dynamic environment, where you can easily adjust to growing demand by spinning up thousands of servers and scaling them down when not in use.
The future of DevOps is infrastructure as code. Infrastructure as code supports the growth of infrastructure and provisioning requests. It treats infrastructure as software: code that can be reused, tested, automated, and version controlled. HashiCorp Terraform applies infrastructure as code throughout its tooling to prevent configuration drift, manage immutable infrastructure, and much more!
Join this webinar to learn why Infrastructure as Code is the answer to managing large scale, distributed systems and service-oriented architectures. We will cover key use cases, a demo of how to use Infrastructure as Code to provision your infrastructure and more:
Agenda:
Intro to Infrastructure as Code: Challenges & Use cases
Writing Infrastructure as Code with Terraform
Collaborating with Teams on Infrastructure
A comprehensive walkthrough of how to manage infrastructure-as-code using Terraform. This presentation includes an introduction to Terraform, a discussion of how to manage Terraform state, how to use Terraform modules, an overview of best practices (e.g. isolation, versioning, loops, if-statements), and a list of gotchas to look out for.
For a written and more in-depth version of this presentation, check out the "Comprehensive Guide to Terraform" blog post series: https://blog.gruntwork.io/a-comprehensive-guide-to-terraform-b3d32832baca
Infrastructure-as-Code (IaC) Using Terraform (Intermediate Edition), by Adin Ermie
In this presentation, we will cover intermediate Terraform topics including alternative providers, collection types, loops and conditionals, and resource lifecycles. We will also focus on reusability with a discussion on modules, data sources, and remote state (including live demo examples).
Finally, we start the initial look into a full DevOps process with a quick review of Workspaces and Terraform Cloud; and wrap everything up with some key takeaway learning resources in your Terraform learning adventure.
NOTE: A recording of this presentation can be found here: https://youtu.be/0CEF4eZ6HiQ
Using HashiCorp's Terraform to build your infrastructure on AWS - Pop-up Loft..., by Amazon Web Services
Using Terraform to automate your infrastructure on AWS. What Terraform is and how it differs from Ansible. How to control cloud deployments using Terraform.
This presentation covers deploying Azure DevOps projects, repositories, pipelines, variable groups, etc., using the newly released Azure DevOps Terraform provider.
A recording of this presentation is available on my YouTube channel here: https://www.youtube.com/c/adinermie
A blog article about this topic is also available here: https://adinermie.com/deploying-azure-devops-ado-using-terraform/
How Many Ohs? (An Integration Guide to Apex & Triple-o), by OPNFV
Dan Radez, Red Hat, Tim Rozet, Red Hat
The OPNFV ecosystem is made up of projects that need to integrate with each other. Project Apex uses Triple-o under the covers, which most people need some assistance to integrate with.
Come and spend a session with the Apex development team learning the ins and outs of Triple-o.
In this session, participants will learn about the deployment process that runs when an Apex/Triple-o deployment is executed, and how to assign services to nodes and generate networking configurations within Triple-o to successfully integrate and deploy a new component in OpenStack.
Come learn how to untangle the learning curve presented when integrating and using Triple-o, and simplify your future development and deployment endeavors with a newfound, intimate knowledge of the Apex & Triple-o platform.
The why, what, and how of leveraging Terraform to manage cloud resources safely.
Experience feedback from its adoption by the Leboncoin data engineering team.
In these slides you will find introduction material for beginners, as well as the advanced use cases you will quickly face when working within a team and under enterprise constraints.
Terraform: infrastructure as code, by Victor Adsuar
Presentation used at the first AWS meetup of the Valencia user group.
Infrastructure as code using Terraform. It shows the main characteristics of this technology, which allows us to be more agile and faster when deploying our platforms on AWS.
A Hands-on Introduction to Terraform Concepts and Best Practices, by Nebulaworks
At our OC DevOps Meetup, we invited Rami Al-Ghami, a Sr. Software Engineer at Workday, to deliver a hands-on presentation on Terraform concepts and best practices.
The software lifecycle does not end when the developer packages their code and makes it ready for deployment. The delivery of this code is an integral part of shipping a product. Infrastructure orchestration and resource configuration should follow a similar lifecycle (and process) to that of the software delivered on it. In this talk, Rami discusses how to use Terraform to automate your infrastructure and software delivery.
https://www.youtube.com/watch?v=IeweKUdHJc4
My presentation from HashiConf 2017, discussing our use of Terraform and our techniques to help make it safe and accessible.
3. Microsoft's investments in Terraform
• Microsoft Team / HashiCorp Team
• Terraform AzureRM Provider updates
  • Latest release v2.18.0 (July 10, 2020)
  • 23 features added (new data sources, resources)
  • 27 enhancements
  • 6 bug fixes
  • 4x releases/updates published in June alone!
• Terraform Module Registry
  • https://registry.terraform.io/browse/modules?provider=azurerm
5. Terraform v0.13 highlights
• Support for count, for_each, and depends_on in modules
• New required_providers syntax
• Custom variable validation
• The terraform login command connects a CLI user to the Terraform Cloud app
variable "image_id" {
  type        = string
  description = "The id of the machine image (AMI) to use for the server."

  validation {
    condition     = length(var.image_id) > 4 && substr(var.image_id, 0, 4) == "ami-"
    error_message = "The image_id value must be a valid AMI id, starting with \"ami-\"."
  }
}
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "2.0.0"
    }
  }
}
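Note that pinning the AzureRM provider to 2.x as above also means the provider configuration must declare a features block (mandatory since AzureRM 2.0), even if it is left empty. A minimal sketch:

```hcl
# The AzureRM 2.x provider requires a features block, even an empty one;
# without it, terraform init/plan will fail with a configuration error.
provider "azurerm" {
  features {}
}
```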
8. Hot off the press!
• Announcing the Azure DevOps Provider for Terraform
• https://cloudblogs.microsoft.com/opensource/2020/06/18/announcing-hashicorp-terraform-azure-devops-provider-release/
9. Configuration
• Using Azure DevOps Repo vs GitHub
  • Yes, you can use GitHub, but I was going for the "full" Azure DevOps (ADO) experience
• Integrated with Azure Key Vault (for SPN credentials)
  • Via Variable Groups in the Pipeline's Library
• Multiple pipelines created
  • Deploy pipelines (hub/spoke/VNet peering)
  • Cleanup pipelines (hub/spoke/VNet peering)
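The ADO configuration described above can be sketched as a minimal azure-pipelines.yml. The variable-group name and working directory are taken from the editor's notes and later slides; the version pin and trigger branch are illustrative assumptions, not values confirmed by the presentation:

```yaml
# Sketch of an Azure DevOps pipeline for the hub deployment (illustrative).
trigger:
  - master

pool:
  vmImage: ubuntu-latest

variables:
  # Variable Group from Pipeline > Library, integrated with Azure Key Vault (SPN credentials)
  - group: Terraform-BuildVariables

steps:
  # Tasks are versioned; the major version is specified with @ in the task name.
  - task: TerraformInstaller@0
    inputs:
      terraformVersion: '0.13.0'   # omitting this input installs an older default, not the latest

  - script: terraform init
    displayName: Terraform Init
    workingDirectory: Terraform/Networking/Deployments/Network-Deployment/Hub-Deploy

  - script: terraform plan -var-file='Hub.tfvars' -out HubDeploy.plan
    displayName: Terraform Plan
    workingDirectory: Terraform/Networking/Deployments/Network-Deployment/Hub-Deploy
```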
16. Configuration
• Using GitHub repo
• Leverage GitHub Secrets (for SPN credentials, SAS keys, API tokens)
• Multiple workflows (aka pipelines) created
  • Deploy workflows (hub/spoke/VNet peering)
  • Cleanup workflows (hub/spoke/VNet peering)
17. GitHub Actions (aka pipelines)
• A JavaScript action that sets up Terraform CLI in your GitHub Actions workflow by:
  • Downloading a specific version of Terraform CLI and adding it to the PATH
  • Configuring the Terraform CLI configuration file with a Terraform Cloud/Enterprise hostname and API token
  • Installing a wrapper script to wrap subsequent calls of the terraform binary and expose its STDOUT, STDERR, and exit code
20. Workflow Jobs
jobs:
  terraform:
    name: 'Terraform'
    runs-on: ubuntu-latest
    # Use the Bash shell regardless of whether the GitHub Actions runner is ubuntu-latest, macos-latest, or windows-latest
    defaults:
      run:
        shell: bash
        working-directory: ./Terraform/Networking/Deployments/Network-Deployment/Hub-Deploy
21. Workflow Steps
    # Install the latest version of Terraform CLI and configure the Terraform CLI configuration file with a Terraform Cloud user API token
    - name: Setup Terraform
      uses: hashicorp/setup-terraform@v1
      with:
        # terraform_version: 0.12.25   # You can use this to set the specific version of Terraform to target.
        cli_config_credentials_token: ${{ secrets.TF_API_TOKEN }}

    # Initialize a new or existing Terraform working directory by creating initial files, loading any remote state, downloading modules, etc.
    - name: Terraform Init
      run: terraform init

    # Generate an execution plan for Terraform
    - name: Terraform Plan
      run: terraform plan -var-file='Hub.tfvars' -out HubDeploy.plan

    # - name: Terraform Apply
    #   if: github.ref == 'refs/heads/master' && github.event_name == 'push'
    #   run: terraform apply -auto-approve
22. What you can't do
• Use modules with a relative path!
  • Known issue #23333
  • Specifically when using Terraform Cloud as the remote backend
• Trigger another Action/Workflow after a workflow is completed (i.e. chaining)
• Manually trigger an Action/Workflow
• Not apparent that you can use an alternative backend (i.e. Azure Storage) when using the built-in Terraform GitHub Action

module "vnets-SharedServices" {
  source = "../../../Hub/"
  ...
}
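Since relative module paths like the one above fail under the Terraform Cloud remote backend, one workaround (see "Using Modules from the Terraform Cloud Private Module Registry" in the resources) is to source the module from a registry instead. A sketch, where the organization, module name, and version are hypothetical placeholders:

```hcl
# Hypothetical: sourcing the hub module from the Terraform Cloud Private
# Module Registry instead of a relative path. "example-org" and the
# version constraint are placeholders, not values from the presentation.
module "vnets-SharedServices" {
  source  = "app.terraform.io/example-org/hub/azurerm"
  version = "1.0.0"
}
```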
25. Configuration
• Using GitHub repo
• Leverage Terraform variables (for SPN credentials)
• Multiple workspaces created (1 workspace = 1 state)
  • Deploy workspace (hub/spoke/VNet peering)
  • Note: "cleanup" workspaces are not required, as the destruction and deletion process is built into the existing one
26. TF Cloud Workspaces (aka pipelines)
• How Terraform Cloud organizes infrastructure
  • Terraform Cloud manages infrastructure collections with workspaces instead of directories
  • Contains configuration, state data, variables, etc.
  • Functions like a completely separate working directory
• Each workspace retains backups of its previous state files
• Retains a record of all run activity
  • Summaries, logs, a reference to the changes that caused the run, and user comments
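To point a configuration at a Terraform Cloud workspace like the ones described above, the remote backend can be declared as follows; the organization and workspace names here are illustrative assumptions:

```hcl
# Illustrative sketch: Terraform Cloud as the remote backend.
# "example-org" and "hub-deploy" are placeholder names.
terraform {
  backend "remote" {
    hostname     = "app.terraform.io"
    organization = "example-org"

    workspaces {
      name = "hub-deploy"
    }
  }
}
```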
28. Workspace Variables
• Terraform vs Environment variables
• terraform.tfvars did not work for me
  • Had to use *.auto.tfvars

terraform plan -var="X" -var-file="Y.tfvars" -out="Z.plan"
29. Workspace Runs
• Terraform Cloud always performs Terraform runs in the context of a workspace
• The workspace serves the same role that a persistent working directory serves when running Terraform locally:
  • It provides the configuration, state, and variables for the run
30. Run Triggers
• Allow runs to queue automatically in this workspace on successful apply of runs in any of the source workspaces
NOTE!
31. Points to remember
• You can't have a custom-named .tfvars file, unless you use the *.auto.tfvars naming
• The workspace "working directory" controls the root terraform init location, with no option/method to traverse directories
• Triggering a delete/destroy will trigger other chained/linked workspaces (i.e. deleting Hub will trigger deploying Spoke)
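The *.auto.tfvars naming rule above means a custom-named variables file must be renamed before Terraform Cloud will load it automatically. A sketch of such a file; the variable names and values are illustrative assumptions, not from the presentation:

```hcl
# Hub.auto.tfvars -- picked up automatically because of the .auto.tfvars suffix.
# A custom-named Hub.tfvars would be ignored unless passed via -var-file.
# All names/values below are placeholders.
location            = "East US"
resource_group_name = "rg-hub-network"
vnet_address_space  = ["10.0.0.0/16"]
```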
34. Bonus! TFLint
• A part of the GitHub Super-Linter
  • One linter to rule them all
• Used to validate against issues
  • Focused on possible errors, etc.
• Support for all providers
• Rules that warn against deprecated syntax
  • AWS = 700+ rules
  • Azure = 279 rules (experimental support)
  • GCP = WIP
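As a sketch of how TFLint might be configured per repository via a .tflint.hcl file — the settings and rule name below are assumptions based on TFLint's documented options, not taken from the presentation:

```hcl
# .tflint.hcl -- illustrative TFLint configuration sketch (assumed options).
config {
  # Also inspect module calls, not just the root module.
  module = true
}

# Enable a core rule that flags declared-but-unused variables and locals.
rule "terraform_unused_declarations" {
  enabled = true
}
```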
35. Resources
• Adin's personal curated list of Terraform resources
• Advanced Tips & Tricks to Optimize your Terraform Code
• Terraform: How to Rename (Instead of Deleting) a Resource
• The Ultimate Terraform Workflow: Setup Terraform (and Remote State) with GitHub Actions
• Automating infrastructure deployments in the Cloud with Terraform and Azure Pipelines
• Deploying Terraform Infrastructure using Azure DevOps Pipelines Step by Step

Don't forget about these Visual Studio Code (VS Code) extensions:
• Azure Terraform (by Microsoft)
• Terraform (by Mikael Olenfalk)
  • Now owned by HashiCorp!
36. More resources
• Misadventures with Terraform
• Azure DevOps Lab - Terraform using GitHub Actions
• Terraform GitHub Actions
• Getting Started with Terraform Cloud
• How to deploy production-grade infrastructure in a fraction of the time using Gruntwork with Terraform Cloud and Terraform Enterprise
• Using Modules from the Terraform Cloud Private Module Registry
38. This is me
Adin Ermie
• Cloud Solution Architect - Azure Apps & Infra @ Microsoft
• Azure Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS)
• Cloud Management & Security
  • Azure Monitor, Azure Security Center (ASC) / Azure Sentinel
• Cloud Governance
  • Azure Policy, Blueprints, Management Groups, and Azure Cost Management (ACM)
• Business Continuity and Disaster Recovery (BCDR)
  • Azure Site Recovery (ASR) / Azure Migrate, and Azure Backup
• Infrastructure-as-Code (IaC)
  • Azure Resource Manager (ARM), and Terraform
• 5x MVP - Cloud and Datacenter Management (CDM)
• 1x HCA - HashiCorp Ambassador
Adin.Ermie@outlook.com
@AdinErmie
https://AdinErmie.com
linkedin.com/in/adinermie
https://github.com/AErmie
Editor's Notes
There are 2 types of Triggers:
Continuous integration (CI), and
Pull request (PR)
Continuous integration (CI) triggers cause a pipeline to run whenever you push an update to the specified branches or you push specified tags.
You can reference a branch (i.e. master), use wildcards (i.e. releases/*), use excludes (i.e. releases/old), or tags (on branches).
Note: You cannot use variables in triggers, as variables are evaluated at runtime (after the trigger has fired).
Note: If you specify an exclude clause without an include clause, then it is equivalent to specifying * in the include clause.
Note: When you specify paths, you must explicitly specify branches to trigger on. You can't trigger a pipeline with only a path filter; you must also have a branch filter, and the changed files that match the path filter must be from a branch that matches the branch filter.
The "Terraform-BuildVariables" variable group comes from Pipeline > Library > Variable Groups (which is integrated with Azure Key Vault).
User-defined variables
System variables
Environment variables
System and user-defined variables also get injected as environment variables for your platform. When variables are turned into environment variables, variable names become uppercase, and periods turn into underscores.
Note that if you do not include the "inputs" "terraformVersion", it will NOT install the latest version, but rather version 0.12.3!
Notice that we're passing the backend config for using Azure Storage as the remote state store through the command line.
On the terraform plan command, you can augment it by including a -var-file reference, and output the plan file.
Tasks are versioned, and you must specify the major version of the task used in your pipeline
In YAML, you specify the major version using @ in the task name (i.e. TerraformInstaller@0).
I want to kick-off the Spoke pipeline after the Hub pipeline has completed
Notice the "trigger" is set to "none", and we have a "resources" > "pipelines" code block.
pipeline: BLAH specifies the name of the pipeline resource.
source: BLAH specifies the name of the triggering pipeline.
To date, there are 28 "terraform" GitHub Actions.
There is one official HashiCorp "Setup Terraform" action.
Workflows are custom automated processes that you can set up in your repository to build, test, package, release, or deploy any code project on GitHub.
With GitHub Actions you can build end-to-end continuous integration (CI) and continuous deployment (CD) capabilities directly in your repository. GitHub Actions powers GitHub's built-in continuous integration service.
The name of the GitHub event that triggers the workflow.
You can provide a single event string, an array of events, an array of event types, or an event configuration map that schedules a workflow or restricts the execution of a workflow to specific files, tags, or branch changes.
You can configure a workflow to start once:
An event on GitHub occurs, such as when someone pushes a commit to a repository or when an issue or pull request is created.
A scheduled event begins. An external event occurs.
To trigger a workflow after an event happens on GitHub, add on: and an event value after the workflow name.
Encrypted secrets
Environment variables
GitHub sets default environment variables that are available to every step in a workflow run.
Environment variables are case-sensitive.
A workflow run is made up of one or more jobs. Jobs run in parallel by default. To run jobs sequentially, you can define dependencies on other jobs using the jobs.<job_id>.needs keyword.
Note the "working-directory" and how the path is set (it does not use the double-dot-slash ..\, but rather a single one).
A workflow run is made up of one or more jobs. Jobs run in parallel by default. To run jobs sequentially, you can define dependencies on other jobs using the jobs.<job_id>.needs keyword.
Note: There is an error when terraform plan tries to use -var-file and -out.
This may be due to the state pointing to Terraform Cloud vs an Azure Storage Account.
This means you cannot use -out to produce a .plan file as an artifact.
This also means you cannot pass in a -var-file; it looks for *.auto.tfvars instead.
A job contains a sequence of tasks called steps.
Not all steps run actions, but all actions run as a step.
Because steps run in their own process, changes to environment variables are not preserved between steps
Trigger an action upon completion of another action: https://github.community/t/trigger-an-action-upon-completion-of-another-action/17642
Triggering a new workflow from another workflow: https://github.community/t/triggering-a-new-workflow-from-another-workflow/16250
At first, I thought I should use the Environment Variables for Subscription ID, Client ID/Secret, and Tenant ID.
But apparently this is not the case, as no value is passed from any key in the "Environment Variables".
In short, if you want to use it as part of a terraform command line (i.e. terraform plan -var="X" -var-file="Y.tfvars" -out="Z.plan"), then you need to use the Terraform Variables.
Terraform Cloud workspaces can set values for two kinds of variables:
Terraform input variables, which define the parameters of a Terraform configuration.
Shell environment variables, which many providers can use for credentials and other data.
Terraform Cloud passes variables to Terraform by writing a terraform.tfvars file and passing the -var-file=terraform.tfvars option to the Terraform command.
Terraform runs managed by Terraform Cloud are called remote operations.
Remote runs can be initiated by webhooks from your VCS provider, by UI controls within Terraform Cloud, by API calls, or by Terraform CLI.
In a workspace linked to a VCS repo, runs start automatically when you merge or commit changes to version control.
A workspace is linked to one branch of its repository, and ignores changes to other branches. Workspaces can also ignore some changes within their branch: if a Terraform working directory is configured, Terraform Cloud assumes that only some of the content in the repository is relevant to Terraform, and ignores changes outside of that content.
Note that a successful APPLY needs to happen in the source workspace first before it triggers the next one
Note the auto-apply warning!
This means you cannot actually successfully âfullyâ deploy an entire environment in an automated way; human interaction is required!
You can connect your workspace to up to 20 source workspaces.