Deploying the MSDTC in a Highly Secure Environment

This is just a fancy way of saying you need to be better friends with whoever is managing your enterprise firewall. I hadn't had to touch the DTC until a recent vendor insisted their application wouldn't work without it (despite their having only a single data store). The MSDTC was developed to coordinate transactions that span multiple machines and dates back to SQL Server 6.5.

In theory it's not super complicated: just enable the DTC service/communication on the servers in question and turn on some built-in firewall rules, right? Almost.

First you need to enable the DTC service. This is located in the Component Services manager: [Screenshot of the Component Services Manager expanded out to show the Local DTC service.]

Right-click and open the Local DTC properties and select the Security tab: [Screenshot of the Security properties tab for the Local DTC service.] We want to check the option Allow Network DTC Access as well as Allow Remote Clients. Under Transaction Manager Communication, the best practice is to Require Mutual Authentication, as our machines should ideally be on the same domain and aware of each other. No Authentication Required is a holdover and should only be used if you meet certain (sad) conditions. Additionally, you may want to use a service account to run the DTC. (In hindsight this may have made my enterprise firewall adventures less adventurous: creating the enterprise firewall rule would have been much simpler if I could just allow the service account to do the things!)
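If you'd rather script those settings than click through the GUI, the MsDtc PowerShell module (which ships with Windows Server 2012 and later) exposes the same options. A rough sketch, assuming the default local DTC instance:

# Sketch: mirror the Security tab settings from PowerShell (MsDtc module).
# The colon syntax keeps the boolean parameters unambiguous.
Import-Module MsDtc
Set-DtcNetworkSetting -DtcName "Local" `
    -RemoteClientAccessEnabled:$true `
    -InboundTransactionsEnabled:$true `
    -OutboundTransactionsEnabled:$true `
    -AuthenticationLevel "Mutual"
# Confirm what actually got applied
Get-DtcNetworkSetting -DtcName "Local"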

Once that's out of the way, there are three inbound and one outbound Windows Firewall rules that need to be enabled to allow DTC traffic. This can be done manually or with the following PowerShell command:

Enable-NetFirewallRule -DisplayGroup "Distributed Transaction Coordinator"
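If you want to double-check what that group actually enabled (and in which direction), something along these lines works:

# List the DTC rule group and confirm which rules are now enabled
Get-NetFirewallRule -DisplayGroup "Distributed Transaction Coordinator" |
    Select-Object DisplayName, Direction, Enabled, Profile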

You'll also need to set up a couple more rules to allow RPC calls to make it through the Windows Firewall. This page will walk you through those rules.
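If you'd rather script those as well, here's a rough sketch using the built-in RPCEPMap and RPC port keywords; the rule names below are my own invention, so adjust naming and scoping to match your environment:

# Sketch: allow the RPC Endpoint Mapper plus dynamic RPC traffic inbound
New-NetFirewallRule -DisplayName "RPC Endpoint Mapper (DTC)" -Direction Inbound -Protocol TCP -LocalPort RPCEPMap -Action Allow
New-NetFirewallRule -DisplayName "RPC Dynamic Ports (DTC)" -Direction Inbound -Protocol TCP -LocalPort RPC -Action Allow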

It's likely that if you are in a highly secure environment you'll need to ensure DTC transactions run on predictable ports. This article walks you through the process of configuring the DTC service to run on a specific port range. You'll need to ensure there are firewall rules in place (both locally and at the enterprise level) to allow that traffic.
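For context, one common way to pin the RPC dynamic range (which the DTC draws its ports from) is the Rpc\Internet registry key. A rough sketch, assuming a hypothetical range of 5000-5200; a reboot is needed for it to take effect:

# Sketch: restrict RPC dynamic ports to 5000-5200
$rpcKey = "HKLM:\SOFTWARE\Microsoft\Rpc\Internet"
if (-not (Test-Path $rpcKey)) { New-Item -Path $rpcKey | Out-Null }
New-ItemProperty -Path $rpcKey -Name "Ports" -PropertyType MultiString -Value @("5000-5200") -Force | Out-Null
New-ItemProperty -Path $rpcKey -Name "PortsInternetAvailable" -PropertyType String -Value "Y" -Force | Out-Null
New-ItemProperty -Path $rpcKey -Name "UseInternetPorts" -PropertyType String -Value "Y" -Force | Out-Null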

Once you think you have everything configured and your firewall rule request has been closed out by information security, it's time to test that everything is working. Thankfully it's PowerShell to the rescue again. The MsDtc module provides a cmdlet for this, which relies on WinRM under the hood (please note it must be run in a console running as admin):

Test-Dtc -LocalComputerName "$env:COMPUTERNAME" -RemoteComputerName "mytargetserver" -ResourceManagerPort [portnumber] -Verbose

Important notes from working my way through this: the WinRM service will be making HTTP calls on port 5985. Similarly, the diagnostic tool starts a resource manager that defaults to running on port 0 unless you pass it a different option; I found it easiest to assign it a port at the end of the dynamic range I'd assigned to the DTC. The machines also need to be able to ping one another.
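A quick pre-flight check of those last two prerequisites (the WinRM port and ping) can save some head-scratching. For example, against the hypothetical target above:

# Sketch: confirm the basics before running Test-Dtc against "mytargetserver"
Test-Connection -ComputerName "mytargetserver" -Count 2
Test-NetConnection -ComputerName "mytargetserver" -Port 5985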

This week was not the most fun, but hopefully this saves someone (or even future me) some time.

Other helpful links:

MS MSDTC documentation

Documentation of the Test-Dtc cmdlet

Summit 2019 Post Mortem

Last week was a very excellent PASS Summit (made somewhat bittersweet by the fact that we don't know exactly when we'll return to Seattle) and I wanted to capture some of the things I brought back, from a high level.

Big SQL News

A couple of announcements of note were made, the first being the promotion of SQL Server 2019 to GA. 2019 includes such fancy features as the ability to perform data classification via metadata tables (as opposed to attaching that information to a column's extended properties); this feature interoperates with auditing to allow you to target the info that needs the most attention in your database. Accelerated Database Recovery speeds up rollbacks and makes transaction log work more efficient. The ability to run R natively in SQL Server has been expanded to a much larger collection of languages via sp_execute_external_script. There's also better support for Docker and Kubernetes, and more.

Azure Arc (currently in preview): This is crazy to me (in a fantastic way), but we can now run Azure services on-prem! This makes infrastructure as code, or the ability to let teams self-service their server needs, that much easier. Plus you get the benefit of Azure threat analysis for those locally running services too. Currently it's limited to Azure SQL Database and Azure Database for PostgreSQL Hyperscale. (More here.)

Azure SQL Data Warehouse is now Azure Synapse Analytics: Microsoft is working to extend the data warehouse by connecting it more closely to our data lakes and analytics to provide more bang for the buck. Deep integration with AI and BI allows those workloads to sit immediately next to the data in a unified environment.

On the Classroom Side

I believe the learning pathways they tried out this year were a success; at every time slot I was forced to make hard choices about which session to attend. The new MS certifications for Azure Data Engineer are an excellent area for career growth, bringing together varied data wrangling skills into a new discipline that provides data for reporting and analysis in our organizations. The learning pathways started with a general understanding of this new role before spreading out to cover the various tools and Azure areas where Data Engineers will be spending their time. Similarly, the AI learning pathways had a number of good sessions, ranging from an introduction to AI and an overview of the process behind machine learning to how DBAs can better support their data scientists within the SQL Server stack.

I'm a privacy/data ethics nerd, so I attended a couple of sessions on data masking and data cataloging that were interesting and really drove home that very few of us are ready for GDPR (or GDPR-like legislation) even now that the GDPR is here and in effect. (Up to 50% of UK companies are still not GDPR compliant!) There were also a number of excellent professional development sessions on technical leadership and helping drive organizational change as part of IT.

My favorite part was how much certain things came up over and over again no matter what session I was in: things like PowerShell (and specifically dbatools) as well as Azure Data Studio (the ability to direct folks internally to a tool that isn't SSMS for data exploration has been a big driver at my org), especially the further development and integration of Jupyter Notebooks.

But my truly favorite part (as always) was the ability to connect with other data professionals face-to-face. To be able to shake the hands of the people who work so hard on dbatools, give feedback to the Azure Data Studio team, or even just find another DBA facing similar challenges so we can cry in our beers together. PASS Summit has definitely been the biggest driver of my career learning since I started attending in 2014. The good news is you can register now for Summit 2020 in Houston: I hope to see you there!