How to Automate Azure Data Factory Deployment with Azure DevOps

Virtually every part of modern life generates data, from using a credit card at the grocery store to driving a car to work. Because of this huge volume of data, there needs to be a way not only to store the information but also to orchestrate and operationalize it into actionable business insights. Azure Data Factory is a managed cloud service that does just this. Using Azure DevOps, we can implement continuous deployment practices to automate Azure Data Factory deployment.

Getting Started

Before integrating Azure Data Factory with Azure DevOps for automated deployment, make sure all prerequisites are met. First, you will need an Azure subscription linked to Azure DevOps Server or Azure Repos through the Azure Resource Manager (ARM) service endpoint. Next, you will need a data factory configured with Azure Repos Git integration. Finally, you will need an Azure key vault containing the secrets for each environment.

Azure Data Factory Integration with Azure Pipelines

With all prerequisites met, you’re ready to set up your Azure Pipelines release. In Azure DevOps, open the project that holds your data factory. Then, open the Releases tab and select the option to create a new release pipeline, choosing the Empty job template. With the pipeline created, modify it by adding an artifact; here, that artifact is the Git repository configured with your data factory. Next, add an ARM deployment task and configure it for this job.
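If you prefer to see what this step looks like in code, here is a minimal sketch of the ARM deployment task in Azure Pipelines YAML. The service connection, resource group, and artifact path are hypothetical; the template file names are the ones Azure Data Factory generates when you publish.

```yaml
# A minimal sketch of the ARM deployment step (names are hypothetical).
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    deploymentScope: 'Resource Group'
    azureResourceManagerConnection: 'my-arm-connection'   # ARM service connection
    subscriptionId: '$(subscriptionId)'
    action: 'Create Or Update Resource Group'
    resourceGroupName: 'rg-adf-test'
    location: 'East US'
    templateLocation: 'Linked artifact'
    # ARM templates generated when the data factory is published
    csmFile: '$(System.DefaultWorkingDirectory)/_adf/ARMTemplateForFactory.json'
    csmParametersFile: '$(System.DefaultWorkingDirectory)/_adf/ARMTemplateParametersForFactory.json'
    deploymentMode: 'Incremental'
```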

If you have secrets to pass in an ARM template, it is recommended to use Azure Key Vault in your release. To do this, add an Azure Key Vault task before the ARM task in your pipeline. It is also recommended to keep a separate key vault for each environment.
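As a sketch, the corresponding task might look like the following; the vault name is hypothetical, and the secrets it fetches become pipeline variables you can pass to the ARM task as override parameters.

```yaml
# A sketch of fetching secrets before the ARM step (vault name is hypothetical).
- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'my-arm-connection'
    KeyVaultName: 'kv-adf-test'
    SecretsFilter: '*'        # or a comma-separated list of secret names
    RunAsPreJob: false
```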

Automating Deployment

Your release pipeline is complete, but you still need to make sure your deployment is automated. This can be done with release triggers: when the trigger conditions are met, the pipeline automatically deploys your artifacts to the specified environment. To allow a release to move from environment to environment in your release pipeline, set up stage triggers. To do this, click the lightning icon on each environment and configure its pre-deployment conditions, including the specific condition(s) that should trigger the deployment.

You now have an automated Azure Data Factory deployment! For more information, or to get started, contact our team of experts here at PRAKTIK.


Azure DevOps Server 2022 Final Release

This month, the final release of Azure DevOps Server 2022 was made public. It rolls up bug fixes and features from the previous release candidates. Let’s discuss some of the feature highlights you can expect in this latest version of Azure DevOps Server.

Delivery Plans

Delivery Plans provide an interactive calendar view of multiple team backlogs: a timeline of the work, its progress, and dependencies between items. There are two main views of delivery plans, Condensed and Expanded. Condensed is better for at-a-glance information, while Expanded gives a fuller picture. Both can be helpful visualizations of a project that enable effective planning and delivery.

This feature is available without an extension for Azure DevOps Services and Azure DevOps Server 2022.

Widget Improvements

The Group By Tags chart widget is now available by default. The widget includes a tags option that lets you visualize your work items by selecting all tags or a specific set of them. Additionally, you can now display custom work items in your burndown widget. To try it out, browse the widget catalog.

Generate Unrestricted Token for Fork Builds

When Azure Pipelines builds contributions from a fork of a GitHub Enterprise repo, it restricts permissions and doesn’t allow pipeline secrets to be accessed. This can be more restrictive than necessary in closed environments. While there are pipeline settings to make secrets available to forks, there was previously no setting to control the job access token scope. This release adds that control, allowing you to generate a regular job access token even for fork builds.

To get a full overview of the many features available in Azure DevOps Server 2022, visit the release notes. For more information, contact our team of experts here at PRAKTIK.


Build and Deploy Apps in a Private Kubernetes Cluster with Azure DevOps

Security is a top concern for many developers and consumers alike. This is especially true if you’re in the financial or government sectors with a lot of sensitive or classified information. Many of us also want to use container orchestration tools like Kubernetes for deployment to allow for faster time-to-market and simplified scalability. One way to ensure the security of your application, as well as to take advantage of Kubernetes, is by deploying to a private Kubernetes cluster using Azure DevOps.

Why a Private Cluster?

A private cluster is just that: private. But how, exactly? The API server is the entry point for controlling and accessing your Kubernetes control plane. By using a private API server, you ensure that all network traffic between the API server and your node pools remains on the private network. The two communicate through the Azure Private Link service in the API server’s virtual network and a private endpoint exposed in the subnet of your AKS cluster.

Build and Deploy in the Private Kubernetes Cluster

Since your AKS cluster is only accessible within its virtual network, you’ll need a self-hosted agent inside that same network. The first step, therefore, is to create a virtual network. Next, you’ll create a private Azure Container Registry (ACR) along with the registry’s private endpoint, which you’ll use to integrate AKS with ACR, and then create the private AKS cluster itself. With the cluster created and integrated, you’ll need a virtual machine to host your agent; this virtual machine lives in the same virtual network as your AKS cluster.
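If you’d rather script this provisioning than click through the portal, here is a hedged sketch using the Azure CLI task in a pipeline. All resource names, address ranges, and the service connection are hypothetical, and the registry’s private endpoint setup is omitted for brevity.

```yaml
# A hedged sketch of the provisioning steps (all names are hypothetical).
- task: AzureCLI@2
  inputs:
    azureSubscription: 'my-arm-connection'
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      # Virtual network and subnet for the cluster and the build agent
      az network vnet create -g rg-private-aks -n vnet-aks \
        --address-prefixes 10.0.0.0/8 \
        --subnet-name aks-subnet --subnet-prefixes 10.240.0.0/16
      # Premium-tier registry (private endpoints require the Premium SKU)
      az acr create -g rg-private-aks -n myprivateacr --sku Premium
      # Private AKS cluster attached to the registry
      az aks create -g rg-private-aks -n aks-private \
        --enable-private-cluster \
        --vnet-subnet-id "<aks-subnet-resource-id>" \
        --attach-acr myprivateacr
```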

After you’ve deployed your agent on the virtual machine, you’ll create the pipeline that builds and deploys your app. You can do this from Azure DevOps Services or from your own instance of Azure DevOps Server. It may seem counter-intuitive to use Azure DevOps Services for your build and deployment because the service lives on the public internet. However, by using a service endpoint, your virtual network resources use private IP addresses to connect to Azure DevOps Services’ public endpoint. This effectively extends the identity of the virtual network to the target resource, and traffic flows over the Azure backbone instead of the public internet. You can therefore take advantage of the ease and power of Azure DevOps Services while still maintaining the level of security and privacy your organization requires.
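To make this concrete, here is a hedged sketch of such a pipeline. The agent pool, service connections, image name, and manifest paths are all hypothetical.

```yaml
# A hedged sketch of a build-and-deploy pipeline for the private cluster.
trigger:
- main

pool: my-private-agent-pool      # self-hosted pool inside the AKS virtual network

steps:
- task: Docker@2
  displayName: Build and push the image to the private ACR
  inputs:
    containerRegistry: 'my-acr-connection'   # Docker registry service connection
    repository: 'myapp'
    command: buildAndPush
    Dockerfile: '**/Dockerfile'
    tags: '$(Build.BuildId)'

- task: KubernetesManifest@1
  displayName: Deploy to the private AKS cluster
  inputs:
    action: deploy
    connectionType: kubernetesServiceConnection
    kubernetesServiceConnection: 'my-aks-connection'
    namespace: 'default'
    manifests: 'manifests/*.yml'
    containers: 'myprivateacr.azurecr.io/myapp:$(Build.BuildId)'
```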

For more information, or to get started today, contact our team of experts at PRAKTIK.


Aha! Roadmaps Integration with Azure DevOps

In order to develop applications successfully, we need to know where we’re going, how to get there, and how we’re doing so far. We also need to be able to coordinate across teams, even if those teams use different tools in their day-to-day work. Aha! Roadmaps is a tool that helps you strategize effectively across teams. In this article, we will discuss Aha! Roadmaps integration with Azure DevOps Services and Azure DevOps Server.

How It Works

Aha! Roadmaps is one of a suite of Aha! tools that enable collaboration between all teams, from developers to product managers. Historically, teams may have used different tools to track their progress and inform their strategy. This meant extra work for collaboration, which sometimes meant collaboration wasn’t prioritized as it should have been. Aha! Roadmaps is a single hub for all things strategy, meant for collaboration between cross-functional teams. It is also built to integrate with multiple tools, so your teams can work where they’re comfortable. The integration provides real-time, two-way updates that are fully customizable to your team’s workflow and terminology.

Integration

To integrate Aha! Roadmaps with Azure DevOps, you’ll start in Aha! Roadmaps and build or import your records. To do this, as a workspace owner, simply add a new cloud or on-premises integration from the workspace settings in Aha!. This launches the integration wizard, which asks you to create a template and authenticate your Azure DevOps credentials. After authentication, you’ll choose your project and start configuring your integration mappings.

If you’d like to take advantage of two-way sync for updates, you’ll set up webhooks in Azure DevOps using the webhook URL in the Aha! integration configuration. You can also use webhooks to send security-related events to a SIEM system, or to stream activity to a third-party tool. If you want additional integration security for your Azure DevOps Server instance, you also have the option to include a client certificate in your integration settings.

Aha! Roadmaps integrates with Azure DevOps to allow you to be as productive as possible, all while enabling that all-important cross-team collaboration. For more information, or to get started today, contact our team of experts at PRAKTIK.


Test-Drive Feature Readiness Before Release

Open-source development has enabled greater collaboration than ever before. Whole communities of people who have never met in person can work together to create just about anything. However, this does not come without its challenges. One such challenge is quality control. Even with strict pull request requirements, it can be difficult to ensure every pull request receives proper testing. Kubernetes preview environments and build validations in Azure Pipelines can make this easier.

Preview Environments in Kubernetes

Kubernetes is an open-source container orchestration tool that automates the deployment, scaling, and management of containerized applications. It enables developers to deploy quickly into test and production environments. A preview environment is an ephemeral environment created from the code of your pull request. By using a preview environment, you can see your changes live and test-drive features before merging to master. Furthermore, by using namespaces within your Kubernetes cluster, you can test your changes in a fully isolated environment that can be destroyed when you’re finished, simply by closing the PR.

To deploy pull requests for review with Azure DevOps, you need to add a build validation branch policy that runs a pipeline in Azure Pipelines.

Build Validations

Build validations are tests that run on a build to check changes before a release. In the Repos settings in Azure DevOps, you’ll define the requirements for pull requests made against your selected branch. With these policies in place, every time someone creates a new pull request targeting that branch, a reviewer can manually decide to deploy the changes to a dedicated Kubernetes namespace for detailed review in your preview environment. Alternatively, you can have the deployment to the namespace happen automatically.
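As an illustration, here is a hedged sketch of a pipeline that such a policy might queue, deploying each pull request into its own namespace. The service connection and manifest path are hypothetical; System.PullRequest.PullRequestId is populated by Azure Pipelines when the policy triggers the run.

```yaml
# A hedged sketch of a PR validation pipeline that creates a preview environment.
trigger: none    # queued only by the build validation branch policy

variables:
  previewNamespace: 'pr-$(System.PullRequest.PullRequestId)'

pool:
  vmImage: ubuntu-latest

steps:
- task: Kubernetes@1
  displayName: Log in to the cluster
  inputs:
    connectionType: 'Kubernetes Service Connection'
    kubernetesServiceEndpoint: 'my-aks-connection'   # hypothetical connection
    command: login

- script: |
    # Create the PR's namespace idempotently, then deploy the build into it
    kubectl create namespace $(previewNamespace) --dry-run=client -o yaml | kubectl apply -f -
    kubectl apply -n $(previewNamespace) -f manifests/
  displayName: Deploy the PR to an isolated preview namespace
```

When the PR is closed, deleting the namespace (for example, with kubectl delete namespace) tears the whole preview environment down.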

In modern DevOps development, deploying quickly and often is critical. Equally important is making sure the code you’re putting into production is valuable to your end users. For more information, or to get started today, contact our team of experts here at PRAKTIK.


DevOps Deployments and Ansible

Doing things by hand is an accident-prone way to work. DevOps best practices state that we should automate as much as possible: automation eliminates human error and frees developers to be productive elsewhere. We can follow this best practice by using tools that enable continuous integration and continuous deployment. One such tool is Ansible.

Ansible Basics

Ansible is an open-source tool that enables automation in your environment. In the past, you may have needed to coordinate the delivery of an application to your end users manually. Now, Ansible can do that work for you, handling everything from cloud provisioning to application deployment, configuration management, and more.

Ansible can also help orchestrate zero-downtime deployments to deliver the best experience to your end users. This is especially important in our fast-paced world: consumers expect to reach their services at all times while also expecting those services to be continuously improved.

How Does Ansible Work?

Getting started with Ansible is as simple as describing your automation job in YAML. This description takes the form of an Ansible Playbook. When going through your DevOps CI/CD pipeline, Ansible uses this playbook to provision everything you described for your deployment in the exact same way, every time. This means that your deployments are simple and repeatable, without any human error.
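For a feel of what a playbook looks like, here is a minimal sketch that provisions an Azure resource group. It assumes the azure.azcollection collection is installed and Azure credentials are available in the environment; the resource names are hypothetical.

```yaml
# playbook.yml: a minimal sketch of an Ansible playbook (names are hypothetical).
- name: Provision infrastructure for the app
  hosts: localhost
  connection: local
  tasks:
    - name: Ensure the resource group exists
      azure.azcollection.azure_rm_resourcegroup:
        name: rg-myapp-dev
        location: eastus
```

Running ansible-playbook playbook.yml applies the description; running it again changes nothing if the resource group already exists, which is exactly what makes deployments repeatable.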

To create and provision resources in Azure, you’ll need a Linux VM with Ansible configured. Additionally, as Ansible is agentless, it will need SSH authentication using a key pair and an SSH service connection in Azure DevOps.

To learn more, or to get started with Ansible today, contact our team of experts here at PRAKTIK.


Jenkins or Azure Pipelines?

Jenkins is an open-source automation server that helps facilitate continuous integration. It is installed on-premises and managed there. Azure Pipelines is a continuous integration tool that is available in the cloud or on-premises, and can manage build and release orchestration. Both are reasonable and popular options, but which is truly the best for your situation?

Simplicity

From an organizational perspective, it is important to determine which tool will get you to your desired end result faster. While Jenkins and Azure DevOps can certainly integrate with one another, remember that this integration carries an additional time cost: using more than one tool requires extra investment in training and maintenance. In many cases, reducing the number of tools is optimal, especially when the additional tools are redundant.

In addition, Azure Pipelines natively integrates with Git repos, Azure Boards, and the rest of Azure DevOps. This kind of integration is hard to pass up, especially because it provides seamless end-to-end traceability of code and work items across releases.

YAML

YAML allows a developer to define the pipeline as code. While using YAML to define pipelines isn’t the right solution for all teams, it can be a powerful tool with certain benefits. For instance, the pipeline itself is managed as a source file, so it goes through the standard code review process, increasing quality. It is also easier to compare versions of the pipeline if something breaks. Azure Pipelines offers a YAML interface in addition to the standard GUI; Jenkins does not.
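For context, a pipeline defined as code can be as small as this sketch of an azure-pipelines.yml, versioned alongside the code it builds; the build command is a placeholder.

```yaml
# azure-pipelines.yml: a minimal sketch of a pipeline defined as code.
trigger:
- main

pool:
  vmImage: ubuntu-latest    # a Microsoft-hosted agent

steps:
- script: echo "build and test commands go here"   # placeholder build step
  displayName: Build and test
```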

Cost

Something we always need to take into consideration is the total cost of a solution. An Azure DevOps instance already has all the infrastructure for running pipelines built in. Not only that, Azure Pipelines comes with 30 hours of free Microsoft-hosted build time per month, and unlimited build minutes for self-hosted jobs. With Jenkins, this is not the case: while Jenkins is open-source and therefore free to use, you are responsible for deploying, maintaining, and paying for your build infrastructure.

This is not a debate about which technology is better. Both are mature enough to cover all build requirements for most companies. This is about deciding which option works best for you and your teams with the least amount of friction. For more information, or to speak to one of our experts, contact us today.


Integrating Power BI and Azure DevOps Analytics

Collecting and analyzing data is an essential part of developing a successful application. You need to be able to determine whether the needs of your end users are being met. But with the sheer volume of data points available, it can be difficult to understand what needs to improve. Power BI is a tool that converts your data into easily readable insights. Power BI and Azure DevOps can work together to provide reports and analytics that fit your needs.

Azure DevOps Analytics

Analytics is the reporting platform for Azure DevOps. It provides data from Azure DevOps you can use to improve your application. For instance, you can access Azure Pipelines analytics with metrics like run failures to improve your code, pipeline, or tests. You can also create Widgets for Azure Boards to track things like Burndown, Cycle Time, and Velocity.

Power BI Integration

Analytics is great for collecting data from within Azure DevOps, but what if you need to pull in data from other sources? This is where Power BI comes in. Power BI allows you to pull in data from any source with a connector. You can even pull Azure DevOps Analytics data into it, allowing you to get a fuller picture of all your data points in one place.

You can pull data from Analytics into Power BI in three ways; the recommended way is to connect using OData queries. The advantage of these queries is that they are powerful and very specific, so only the data you want is returned. You can also pre-aggregate data server-side, meaning the data is collected and analyzed and then presented to you as summarized findings in Power BI. There is no need to pull down all the detailed data, saving you valuable time.
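As an illustration, an Analytics OData query is just a URL. This hypothetical example asks the service to count a project’s active bugs server-side, so only the single summarized number comes back; the organization and project segments are placeholders.

```
https://analytics.dev.azure.com/{organization}/{project}/_odata/v3.0-preview/WorkItems?
    $apply=filter(WorkItemType eq 'Bug' and State eq 'Active')/aggregate($count as ActiveBugCount)
```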

With Power BI, you can create organization-level metrics for high-level information, or you can drill down into specifics that allow you to address problem areas. You can also create project health reports to look at bug trends, build success rates, and other specific metrics at the organization level. These types of data points are critical when it comes to the success of your application.

As of this writing, Power BI integration with Analytics is currently in Preview. For more information on Power BI, or to get started today, contact our team of experts at PRAKTIK.


SonarQube—Azure DevOps Integration

Quality and security are among the most important parts of any project. While it’s important to deliver a solid user experience, it’s equally important to maintain security standards. This can be accomplished with SonarQube, a tool that integrates with Azure DevOps to give you data where you need it, such as in your pipelines and pull requests.

What is it?

SonarQube is a self-hosted code analysis service that detects issues affecting the reliability, security, and quality of your project. It finds problems in your code and provides guidance on how best to address them. You can also use it to add quality gates to your CI/CD workflow: if the quality criteria are not met, the job fails so you can correct the problem before it reaches production. Additionally, the tool decorates issues directly in your Azure DevOps pull requests, which helps you deal with them sooner. In this article, we will focus on the cloud-hosted version of the product, SonarCloud, which is free for open-source projects; you only pay when you start analyzing private repositories.

How to Integrate with Azure DevOps

Integrating SonarCloud with Azure DevOps is as simple as installing the extension from the Visual Studio Marketplace and following the setup flow on SonarCloud’s website. The flow asks for things like your Azure DevOps organization name and a Personal Access Token. Then, you’ll set up a SonarCloud organization and project, and choose a plan for your SonarCloud subscription; if all the repositories you want to analyze are public, you can choose the free plan.

Now you’re ready to set up your analysis. To do this, follow the SonarCloud walk-through to set up scanning in Azure Pipelines. The analysis runs during your build, and you can include quality gates that cause the build to fail if it does not pass the quality check. After your build runs, you’ll be able to view the detailed SonarCloud report in the build summary. Additionally, you can set up pull request integration so the Azure DevOps UI shows when an analysis build is running; this is done by configuring your build policy in Azure DevOps and giving SonarCloud access to your pull requests. The results are visible directly in Azure DevOps or on the SonarCloud dashboard.
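For reference, here is a hedged sketch of the SonarCloud tasks in a YAML build; the service connection, organization, project values, and build script are hypothetical.

```yaml
# A hedged sketch of SonarCloud analysis in an Azure Pipelines build.
steps:
- task: SonarCloudPrepare@1
  inputs:
    SonarCloud: 'my-sonarcloud-connection'   # hypothetical service connection
    organization: 'my-org'
    scannerMode: 'CLI'
    configMode: 'manual'
    cliProjectKey: 'my-org_my-project'
    cliProjectName: 'My Project'
    cliSources: '.'

- script: ./build.sh    # your usual build and test steps go here

- task: SonarCloudAnalyze@1

- task: SonarCloudPublish@1
  inputs:
    pollingTimeoutSec: '300'
```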

The information provided by SonarCloud and its integration with Azure DevOps is invaluable. Now, you’ll be able to identify and repair issues faster and more efficiently. For more information about SonarCloud, or to get started today, contact our team of experts at PRAKTIK.


Using Terraform with Azure DevOps

Managing infrastructure can be tricky business. Development teams must maintain the settings for each individual deployment environment, and over time, environments can become difficult or impossible to reproduce, forcing teams to rely on hard-to-track manual processes to create and maintain them. Instead of going through this headache, we can use an Infrastructure as Code (IaC) tool called Terraform.

What is Terraform?

Terraform is an open-source IaC tool for provisioning and managing cloud infrastructure. It allows users to define a desired end-state infrastructure configuration, then provisions the infrastructure exactly as described. It can also safely and efficiently re-provision infrastructure in response to configuration changes. This means that your infrastructure will be exactly the same every time.

Using Terraform with Azure DevOps

As with many technologies, there is an extension in the Visual Studio Marketplace to make your life easier. To get started, simply install the Terraform extension. The extension provides service connections for AWS and GCP for deployment to Amazon or Google clouds; if you’re deploying to Azure, you’ll need to create an Azure Service Principal instead. It also includes a task for installing the required version of Terraform on your agent and a task for executing the core commands. These tasks require some configuration, such as defining your provider and the command you want the tool to execute.

After your configuration is complete, you can include the Terraform tasks in your Build or Release Pipeline to manage your infrastructure automatically.
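As a hedged sketch, the tasks might look like this in YAML; the service connection and backend storage names are hypothetical.

```yaml
# A hedged sketch of the Terraform extension tasks (names are hypothetical).
steps:
- task: TerraformInstaller@0
  displayName: Install Terraform
  inputs:
    terraformVersion: 'latest'

- task: TerraformTaskV4@4
  displayName: terraform init
  inputs:
    provider: 'azurerm'
    command: 'init'
    backendServiceArm: 'my-arm-connection'
    backendAzureRmResourceGroupName: 'rg-tfstate'
    backendAzureRmStorageAccountName: 'sttfstate001'
    backendAzureRmContainerName: 'tfstate'
    backendAzureRmKey: 'app.terraform.tfstate'

- task: TerraformTaskV4@4
  displayName: terraform apply
  inputs:
    provider: 'azurerm'
    command: 'apply'
    environmentServiceNameAzureRM: 'my-arm-connection'
```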

For more information about Terraform, or to get started, contact our team of experts at PRAKTIK.