My notes and ramblings, normally about automation

Infrastructure-as-code-based provisioning can easily grow to the point of chaos. Sentinel, HashiCorp’s policy-as-code framework, makes it easy to configure guardrails that are enforced within the provisioning workflow to protect against changes that don’t follow security, regulatory compliance, or internal business policies. Check out how to apply some simple policies to your Azure-based Terraform configurations with Terraform Cloud! DevOps Lab - Policy as Code with Terraform and Sentinel
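To give a taste of what a Sentinel policy looks like, here’s a minimal sketch that restricts Azure resource groups to approved regions. The allowed locations and the use of the tfplan/v2 import are illustrative assumptions, not taken from the video:

```sentinel
# Hypothetical policy: only allow azurerm_resource_group resources
# in an approved set of Azure regions.
import "tfplan/v2" as tfplan

allowed_locations = ["eastus", "westus2"]

# Collect resource groups being created or updated in this plan.
resource_groups = filter tfplan.resource_changes as _, rc {
	rc.type is "azurerm_resource_group" and
	rc.mode is "managed" and
	(rc.change.actions contains "create" or rc.change.actions contains "update")
}

location_allowed = rule {
	all resource_groups as _, rg {
		rg.change.after.location in allowed_locations
	}
}

main = rule { location_allowed }
```

Attached to a Terraform Cloud workspace, a policy like this runs between plan and apply, failing the run before a non-compliant change reaches Azure.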

The latest Terraform releases have included lots of features and functionality. Have you integrated all the improvements into your configurations and workflows? Join Kyle and Petros for a dive into some of the hidden features found in Terraform 0.13 and a sneak peek at what is coming up in Terraform 0.14 later this fall! From CLI to Cloud and Back in HashiCorp Terraform. For more information about Terraform, see HashiCorp’s site as well as the Learn platform to see how to get started.
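As a hedged sketch of a couple of the 0.13 additions (the module path, regions, and version constraints below are hypothetical): 0.13 introduced `required_providers` with a `source` address, `for_each` on modules, and custom variable validation.

```hcl
terraform {
  required_version = ">= 0.13"

  # 0.13: providers now declare a registry source address.
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~> 2.0"
    }
  }
}

# 0.13: for_each (and count) can now be used on module blocks.
module "network" {
  source   = "./modules/network" # hypothetical local module
  for_each = toset(["dev", "prod"])
  name     = each.key
}

# 0.13: custom validation rules on input variables.
variable "location" {
  type    = string
  default = "eastus"

  validation {
    condition     = contains(["eastus", "westus2"], var.location)
    error_message = "Location must be an approved Azure region."
  }
}
```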

HashiCorp’s Terraform Cloud is a SaaS platform where we can more easily collaborate on Terraform configurations through remotely stored, versioned, and shared Terraform state files. Check out how easy it is to get started deploying your Azure resources! DevOps Lab - Remote State Management with Terraform Cloud
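Pointing an existing configuration at Terraform Cloud’s remote state is a small change. The sketch below uses a hypothetical organization and workspace name:

```hcl
terraform {
  # Store state remotely in Terraform Cloud instead of on local disk.
  backend "remote" {
    hostname     = "app.terraform.io"
    organization = "example-org" # hypothetical organization

    workspaces {
      name = "azure-infrastructure" # hypothetical workspace
    }
  }
}
```

After adding the block, a `terraform init` migrates the local state file into the Terraform Cloud workspace, where it is versioned and shareable with the rest of the team.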

Terraform Cloud has added a brand-new tier with some fantastic new features. As part of this video, Michelle Greer and I take a look at Terraform Cloud’s Business tier and how it can be used to help control cloud costs and adhere to an organization’s policies. Some of the features in use are Terraform Cloud’s Cost Estimation, the Sentinel policy-as-code framework, an integration with Splunk, and more! Check out the video below for all the details.

The Peter Parker Principle dictates, “With great power comes great responsibility.” This can also be applied to the way security is applied to our GitOps processes and DevOps in general. In this video Brad Downey, from GitLab, and I go over the Terraform Cloud and GitLab best practices and features that allow your team to focus more on managing infrastructure, and less on security concerns. Be sure to check out the demos at the end to see the numerous ways that we can maintain security and compliance without interrupting the pipeline.

Ignite is Microsoft’s annual conference. This year was the first time I had ever been able to attend, and I also had the pleasure of sitting on an “Ask the Experts” panel. The panel was composed of some very notable PowerShell community members, including Bonnie Runimas and Rob Sewell, a couple of PowerShell program members, including Sydney Smith and Joey Aiello, and the fantastic moderator team of Aleksandar Nikolic, Esther Barthel, and Jaap Brasser.

I was recently asked to work on some integration pieces between Terraform and Splunk. Luckily, Splunk has a Docker container image that handles all the work of deploying a VM, updating it, installing Splunk, and performing the initial configurations. Docker containers are an incredibly easy way to set up many types of environments. With a couple of simple commands, a ready-to-use environment can be deployed. Normally, I’ve used containers to stand up consistent and repeatable development environments.
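To show how little configuration such a lab needs, here’s a minimal docker-compose sketch for the splunk/splunk image; the password value is a placeholder and the port choices are assumptions:

```yaml
# docker-compose.yml - a minimal local Splunk lab (sketch).
version: "3"
services:
  splunk:
    image: splunk/splunk:latest
    environment:
      # The image requires accepting the Splunk license on start.
      SPLUNK_START_ARGS: --accept-license
      SPLUNK_PASSWORD: ChangeMe123!   # placeholder admin password
    ports:
      - "8000:8000"   # Splunk Web UI
      - "8089:8089"   # management/REST API
```

A `docker compose up -d` with a file like this stands up a throwaway Splunk instance reachable at http://localhost:8000.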

Terraform Cloud has added a brand-new tier with some fantastic new features. As part of this release announcement, Armon Dadgar and I take a look at Terraform Cloud’s new Business tier and the features that enterprise environments have come to expect. From multiple SSO (single sign-on) integrations, to self-hosted agents that can reach segmented networks, to a massive increase in visibility thanks to an integration with Splunk, this is a big improvement in the features that the Terraform Cloud service offers.

A while back, I transitioned my blog from WordPress to Hugo. The transition was relatively painless, mainly due to the number of tools available to manage that process. However, this blog isn’t about the transition. We’re going to be talking about some of the nuances of using Hugo as your blog platform of choice and how to make those easier. The first thing to make clear: Hugo isn’t a hosting platform.

The usage of Kubernetes continues to grow for most organizations. HashiCorp Terraform is used to simplify the Kubernetes deployment and management process by defining the necessary components as code. This can also be taken a step further by configuring guardrails, which help protect against infrastructure changes that go against the business’s policies or don’t follow regulatory requirements. As with the infrastructure, we can define these policies as code with Sentinel.
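A minimal sketch of driving Kubernetes from Terraform with the kubernetes provider; the kubeconfig path, namespace, and deployment details are hypothetical:

```hcl
provider "kubernetes" {
  config_path = "~/.kube/config" # assumes a local kubeconfig
}

resource "kubernetes_namespace" "example" {
  metadata {
    name = "demo" # hypothetical namespace
  }
}

resource "kubernetes_deployment" "nginx" {
  metadata {
    name      = "nginx"
    namespace = kubernetes_namespace.example.metadata[0].name
  }

  spec {
    replicas = 2

    selector {
      match_labels = { app = "nginx" }
    }

    template {
      metadata {
        labels = { app = "nginx" }
      }

      spec {
        container {
          name  = "nginx"
          image = "nginx:1.19"
        }
      }
    }
  }
}
```

With the components defined this way, a Sentinel policy can inspect the same plan data — for example, requiring a minimum replica count or an approved image registry — before the apply runs.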

Recently, there was a great new resource added to the GitHub Marketplace which allows us to easily integrate the HashiCorp Terraform CLI into the CI/CD process that’s been made available with GitHub Actions. The HashiCorp Setup Terraform action is also available as a starter workflow, accessible directly within the Actions tab of your GitHub repository. GitHub Actions make use of a YAML-formatted workflow file. This file establishes the configuration for things such as which events should trigger the workflow, the type of runner to use, and the configuration of the jobs with the required steps to reach completion.
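A minimal workflow sketch using the Setup Terraform action; the branch name and the exact steps are illustrative assumptions:

```yaml
# .github/workflows/terraform.yml
name: Terraform

on:
  push:
    branches: [main]   # hypothetical branch
  pull_request:

jobs:
  terraform:
    runs-on: ubuntu-latest   # the runner type
    steps:
      - uses: actions/checkout@v2

      # Installs the Terraform CLI onto the runner.
      - uses: hashicorp/setup-terraform@v1

      - run: terraform init
      - run: terraform fmt -check
      - run: terraform plan
```

Each `on:` event, the runner, and the job steps map directly to the workflow-file concepts described above.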

The good folks at vBrownbag always have some really good series. The current series is around using Python and applying it to DevOps concepts and principles. I was asked to present one of the sessions and decided to cover a few tips and tricks for folks with existing PowerShell knowledge and how that knowledge translates directly to Python. vBrownbag - Python for DevOps - Python for PowerShellers
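To give a flavor of the session, here’s a small, hedged set of PowerShell-to-Python mappings; the specific cmdlet analogies are my own illustrations, not pulled from the recording:

```python
# A few common PowerShell idioms translated to Python.

numbers = [1, 2, 3, 4, 5]

# Where-Object { $_ -gt 2 }  ->  a list comprehension
big = [n for n in numbers if n > 2]

# Select-Object -First 2  ->  slicing
first_two = numbers[:2]

# Sort-Object -Descending  ->  sorted(..., reverse=True)
descending = sorted(numbers, reverse=True)

# Get-Member  ->  dir() to discover what an object exposes
has_upper = "upper" in dir("hello")

# Measure-Object -Sum  ->  sum()
total = sum(numbers)

print(big, first_two, descending, has_upper, total)
```

The pipeline mindset carries over almost directly; the main shift is from cmdlets operating on streams of objects to functions and comprehensions operating on iterables.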

PSConfEU is Europe’s largest PowerShell conference and this year I had the distinct pleasure of not only attending, but also presenting! It is 4 days of non-stop PowerShell knowledge bombs in just about every area you can think of and I had a blast! If you have a chance to attend, definitely take advantage of it. Luckily, if you missed out, all the sessions were recorded (huge shoutout to Thorsten Butz for all the hard work he put in) and are available on YouTube: PSConfEU Playlist

Keeping a VM’s VMware Tools up to date is an important task for anyone who administers a VMware environment. VMware Tools provides the latest and greatest drivers as well as easy access to interact with the underlying guest OS, such as performing power operations gracefully. However, there was also a recent security issue announced through the VMware Security Advisories page under Advisory ID: VMSA-2019-0009. More information about this advisory can be found in the following blog post: Security Issue with VMware Tools: VMSA-2019-0009

Referencing documentation is one of those things that seems overwhelming at first, but ends up becoming fundamental. I’m working on a series of posts where I take specific tasks and show how to refer to the documentation to accomplish the task. In this blog, we’ve been tasked with reporting information about a VM’s disk space and its associated filesystem. We’ll be working with the built-in PowerCLI .NET objects.
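As a sketch of what that report looks like with the high-level PowerCLI objects — the VM name is hypothetical, and an active `Connect-VIServer` session is assumed:

```powershell
# Pull the VM object, then read the Tools-reported filesystem data
# surfaced on the Guest property of the PowerCLI .NET object.
$vm = Get-VM -Name 'web01'   # hypothetical VM name

$vm.Guest.Disks |
    Select-Object Path, CapacityGB, FreeSpaceGB
```

The `Path`, `CapacityGB`, and `FreeSpaceGB` properties here are exactly the ones the PowerCLI cmdlet reference documents on the disk objects under `Guest`.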

Referencing documentation is one of those things that seems overwhelming at first, but ends up becoming fundamental. I’m working on a series of posts where I take specific tasks and show how to refer to the documentation to accomplish the task. In this blog, we’ve been tasked with reporting information about a VM’s disk space and its associated filesystem. We’ll be working with the vSphere objects. vSphere objects are based on the vSphere Web Services API, which means that we’ll be using a different set of documents to pull the same data.
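The same report through the vSphere API view (`ExtensionData`) looks slightly different; the VM name is hypothetical and an active `Connect-VIServer` session is assumed. The API reports capacity in bytes, so a conversion keeps the output readable:

```powershell
# guest.disk in the vSphere Web Services API exposes DiskPath,
# Capacity, and FreeSpace (both in bytes).
$vm = Get-VM -Name 'web01'   # hypothetical VM name

$vm.ExtensionData.Guest.Disk |
    Select-Object DiskPath,
        @{Name = 'CapacityGB';  Expression = { [math]::Round($_.Capacity / 1GB, 2) }},
        @{Name = 'FreeSpaceGB'; Expression = { [math]::Round($_.FreeSpace / 1GB, 2) }}
```

Comparing this with the high-level object approach shows why the two documentation sets differ: same data, different property names and units.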

Documentation is an important part of the automation and development process. When I first started using PowerCLI, I found the docs to be overwhelming and confusing. As my PowerCLI knowledge grew, I started to use them more and more. Instead of frustratingly browsing the multiple levels of properties that make up vCenter objects in a terminal, I found I could easily pick them out in the docs. As part of this blog post, we’re going to walk through the available documentation, which documentation should be used at what points, and then walk through two use cases of using the documentation to perform a task.

I found something interesting the other day, thanks to someone who mentioned it to me: you can’t move a datastore cluster between folders in the UI! No worries, PowerCLI to the rescue! PowerCLI doesn’t have a high-level cmdlet for this action, so we’ll be creating our own Move-DatastoreCluster. To perform this action, we’ll be using a method that’s available in the vSphere API known as “MoveIntoFolder.” We can see some additional information about this method in the VMware Code API Explorer: MoveIntoFolder Method. I have created and shared a script on the PowerCLI Community repository and the VMware Code Sample Exchange which takes that “MoveIntoFolder” method and wraps it in an advanced function we can call with: Move-DatastoreCluster
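A hedged sketch of the underlying API call before wrapping it in a function; the folder and cluster names are hypothetical, and an active `Connect-VIServer` session is assumed:

```powershell
# Target datastore folder and the datastore cluster to relocate.
$folder = Get-Folder -Name 'TargetFolder' -Type Datastore   # hypothetical
$dsc    = Get-DatastoreCluster -Name 'MyDatastoreCluster'   # hypothetical

# MoveIntoFolder takes an array of managed object references,
# so we pass the datastore cluster's MoRef from its API view.
$folder.ExtensionData.MoveIntoFolder(@($dsc.ExtensionData.MoRef))
```

Wrapping those three lines in an advanced function with `-DatastoreCluster` and `-Destination` parameters is essentially what the shared Move-DatastoreCluster script does.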

A recent knowledge base (KB) article was released regarding an issue impacting a specific version of VMware Tools. The KB in question is 57796, which describes the possibility of guest-level network connectivity issues or even a purple diagnostic screen (PSOD). Before getting to the discovery process, I want to cover some of the specifics for this KB. I do this because we’re going to need to be aware of these as we build out our one-liners and the subsequent reporting script.
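As a starting point for the discovery process, a hedged one-liner that surfaces each VM’s Tools version so it can be compared against the build numbers called out in the KB; an active `Connect-VIServer` session is assumed, and no specific version values are hard-coded here:

```powershell
# Report Tools version and status for every VM; filter the output
# against the affected versions listed in KB 57796.
Get-VM |
    Select-Object Name,
        @{Name = 'ToolsVersion'; Expression = { $_.ExtensionData.Guest.ToolsVersion }},
        @{Name = 'ToolsStatus';  Expression = { $_.ExtensionData.Guest.ToolsStatus }}
```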