Enable modern authentication for Exchange Online via PowerShell

Modern authentication is disabled by default in Exchange Online in Office 365. However, you are quite likely to want it, because modern authentication in Office 365 enables features like multi-factor authentication (MFA) with smart cards, certificate-based authentication, and third-party SAML identity providers.

You can enable modern authentication in Exchange Online via PowerShell. However, I found that the article explaining how to enable modern authentication for Exchange Online is missing some detail about how to connect to Exchange Online in the first place.

For reference, below is a sample script for connecting to Exchange Online:

# Capture your credentials to a credential object 
$UserCredential = Get-Credential

# Establish a remote connection to EO in your O365 tenant
$Session = New-PSSession -ConfigurationName Microsoft.Exchange `
-ConnectionUri https://outlook.office365.com/powershell-liveid/ `
-Credential $UserCredential -Authentication Basic -AllowRedirection

Import-PSSession $Session

# Check if modern auth is in place already 
Get-OrganizationConfig | Format-Table -Auto Name,OAuth*
 
# If modern auth setting is false, then enable it
Set-OrganizationConfig -OAuth2ClientProfileEnabled $true

# Check again to ensure it comes back as "True"
Get-OrganizationConfig | Format-Table -Auto Name,OAuth*
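When you are finished, it's good practice to tear down the remote session, since Exchange Online limits the number of concurrent remote PowerShell sessions per user:

```powershell
# Disconnect the remote session when finished
Remove-PSSession $Session
```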

Questions or comments? Use the comments section below.

Enable modern authentication for Skype for Business Online via PowerShell

Modern authentication is disabled in Skype for Business Online in Office 365 by default. However, you can enable it via PowerShell. The article explaining how to enable modern authentication in Skype for Business in Office 365 is missing a small item or two (like where to get the PowerShell module you need).

For reference, below is a sample script with the basics in one place.

First, download and install the Skype for Business Online PowerShell module:
https://www.microsoft.com/en-us/download/details.aspx?id=39366

Run these commands one at a time, as you will need to paste a value into the Get-Command line, as mentioned in the comment above it.

# Capture your credentials to a credential object 
$UserCredential = Get-Credential

# Connect to Skype for Business in your Office 365 tenant
$session = New-CsOnlineSession -Credential $UserCredential -Verbose

Import-PSSession $session

# Paste the value from Name field to replace tmp_vubyaegp.m3f below 
Get-Command -Module tmp_vubyaegp.m3f

# Enable the Skype for Business tenant to support modern auth
Set-CsOAuthConfiguration -ClientAdalAuthOverride Allowed

# verify success
Get-CsOAuthConfiguration

Questions or comments? Use the comments section below.

5 ways to secure your SQL data in Microsoft Azure

Data security in the cloud is of chief concern not only to healthcare and financial services, but to anyone with sensitive data of any kind that should only be disclosed to authorized parties. No discussion of enterprise security would be complete without a look at data protection and governance.

For purposes of this discussion, data comes in two forms:

  • Structured. Structured data refers to kinds of data with a high level of organization, such as information stored in a relational database, as in Microsoft SQL Server.
  • Unstructured. Unstructured data refers to data that is not contained in a database or some other type of data structure. Examples include email messages, Word documents, PowerPoint presentations and instant messages.

Important considerations in data protection and governance include data classification and rights management, encryption at-rest and in-flight, as well as management and storage of encryption keys and other secrets related to securing data.

Securing Structured Data In-Flight & In Use

SQL Server 2016 (both SQL in VMs and Azure SQL) introduces some new capabilities to prevent unintentional leakage of data by misconfigured applications or security controls. Key highlights are listed below:

#1 Always Encrypted:

This is a client-side encryption capability, enabling the application to encrypt data so the SQL server (or the service, if using Azure SQL) can never see the data. This is particularly useful for protecting content such as social insurance/social security numbers (SIN/SSN), credit card numbers, and private health identifiers.


#2 Row-Level Security:

This allows the organization to create policies which only return data rows appropriate for the user executing the query. For example, this allows a hospital to only return health information of patients directly related to a nurse, or a bank teller to only see rows returned which are relevant to their role. For more info, see https://msdn.microsoft.com/en-us/library/dn765131.aspx.

#3 Dynamic Data Masking:

This allows the organization to create policies to mask data in a particular field. For example, an agent at a call center may identify callers by the last few digits of their social security number or credit card number, but those pieces of information should not be fully exposed to the agent. Dynamic Data Masking can be configured on the SQL server so that application queries return credit card numbers as XXXX-XXXX-XXXX-1234.


These capabilities help prevent and mitigate accidental exposure of data while it is in-flight or in-use by a front-end application. For more info, see https://msdn.microsoft.com/en-us/library/mt130841.aspx.

Securing Structured Data At-Rest

Protection of SQL data at-rest is a long-standing capability that the SQL Server product team at Microsoft has enhanced in the 2016 release.

#4 SQL Transparent Data Encryption

In order to protect structured data at-rest, Microsoft first introduced SQL Transparent Data Encryption in SQL Server 2008. This technology protects data by performing I/O encryption for SQL database and log files. Traditionally, the database encryption key (DEK) would be protected by a certificate that SQL Server manages, stored locally within the SQL master database. In June 2016, Microsoft made a significant enhancement to this capability by making generally available a SQL Server Connector for Azure Key Vault.

(Image: SQL Server Connector for Azure Key Vault. Image credit: Microsoft)

This allows organizations to separate SQL and Security Administrator roles, enabling a SQL Administrator to leverage a key managed by the security operators in Azure Key Vault, with a full audit trail should the SQL administrator turn rogue. This connector can also be used for encrypting specific database columns and backups, and is backward compatible all the way back to SQL 2008.

More info at https://msdn.microsoft.com/en-us/library/dn198405.aspx

Detecting SQL Threats

In addition to securing SQL data, we also need to consider protecting data sources from the threats that would lead to breach.

#5 SQL Threat Detection

Running SQL in the cloud brings some additional benefits. For databases running on the Azure SQL service, the new SQL Threat Detection service monitors database activity and access, building profiles to identify anomalous behavior or access. If suspicious activity is detected, security personnel can get immediate notification about the activities as they occur. Each notification provides details of the suspicious activity and recommendations on remediating the threat.

SQL Threat Detection for Azure SQL Database can detect threats such as the following:

  • Potential Vulnerabilities: SQL Threat Detection will detect common misconfigurations in application connectivity to the SQL data, and provide recommendations to the administrators to harden the environment.
  • SQL Injection Attacks: One of the most common approaches to data extraction is to insert a SQL query into an unprotected web form, causing the form to return data that was unintended. SQL Threat Detection can identify if an attacker is attempting to leverage this mechanism to extract data.
  • Anomalous Database Access: If a compromised database administrator account starts to execute queries from an abnormal location, SQL Threat Detection can detect and alert on the potential insider threat or identity compromise, enabling the security personnel to update firewall rules or disable the account.

SQL Threat Detection for Azure SQL Database is a powerful new tool in detecting potential data leakage threats. For more info, see https://docs.microsoft.com/en-us/azure/sql-database/sql-database-threat-detection.

I hope you’ve found this short read on some of Microsoft’s capabilities for protecting structured data valuable. Questions or comments? Feel free to leave your thoughts in the comments section at the end of this article.

Remoting in Azure Automation Runbooks (SQL DB Creation sample)

I find that using PowerShell remoting in my Azure automation runbooks is sometimes more convenient, as it eliminates the need to install and update additional PowerShell modules on my OMS hybrid runbook worker. For future reference, I wanted to capture an example of a simple approach to PowerShell remoting I find intuitive.

The activities in this relatively simple example of remoting in an Azure Automation runbook include the following:

  • Retrieves user and password info from Azure automation variables, then creates a PsCredential object
  • Remotes from the worker where it is run (the HRW in my case) to a SQL 2014 or 2016 server. (The name of the server and SQL instance are supplied in the $RemoteComputer and $SQLInstance parameters of the runbook)
  • Loads the SQL PowerShell module
  • Creates the SQL database with the default settings, named per the $DatabaseName runbook parameter
  • Includes a trace log to demonstrate where the code executes (on the remote SQL server)

PARAM(

$RemoteComputer,
$SqlInstance,
$DatabaseName

)

# Retrieve admin user and password, create credential object
$strScriptUser = Get-AutomationVariable -Name 'ContosoAdminUser'
$strPass = Get-AutomationVariable -Name 'ContosoAdminPassword'
$PSS = ConvertTo-SecureString $strPass -AsPlainText -Force
$cred = new-object system.management.automation.PSCredential $strScriptUser,$PSS

#Invoke script block on the remote SQL server
Invoke-Command -Computername $RemoteComputer -Credential $cred -ScriptBlock {

# Import SQL Server Module called SQLPS
Import-Module SQLPS -DisableNameChecking

# Simple log to prove we are remoting
# Logs to c:\Windows\Temp\remoting.txt
$CurrDate = Get-Date
$message = "We remoted to $env:ComputerName on $CurrDate"
$file = "c:\Windows\Temp\remoting.txt"
$Message | Add-Content -Path $file

# Your SQL Server Instance Name
$SqlInst = "$using:SqlInstance"
$Srvr = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $SqlInst

# Create the database with default settings, named per the runbook parameter
# Assumes this database does not yet exist in the current instance
$DBName = "$using:DatabaseName"
$db = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Database($Srvr, $DBName)
$db.Create()

}
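For testing, you can start the runbook on your Hybrid Runbook Worker group with the AzureRM Automation cmdlets. The resource group, automation account, runbook, worker group, and server names below are placeholders for your own environment:

```powershell
# Start the runbook on a Hybrid Runbook Worker group (placeholder names)
$params = @{
    RemoteComputer = "SQL01"
    SqlInstance    = "SQL01"
    DatabaseName   = "PSDB"
}
Start-AzureRmAutomationRunbook -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" -Name "New-SqlDatabase" `
    -Parameters $params -RunOn "MyHybridWorkerGroup"
```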

Searching files in a zip archive in memory with PowerShell

I found myself tonight needing to build a function to search the contents of text files in a very large zip archive to find one containing a specific value. To handle the operation quickly, I wanted to perform the operation in memory. While that may be too specific to be useful for most, I thought at least an example of how to retrieve a file within a zip archive and parse its content might be interesting to a wider audience, and more importantly, a useful archive for myself.

P.S. – I am happy to share the more complex end result if anyone tells me it’s useful to them.

What you will need

While the new Expand-Archive and Compress-Archive cmdlets are handy for basic zip archive creation and extraction, they are not much help when you need to get down to the item level within a zip archive. The System.IO.Compression assemblies are required, and you must load them explicitly, as they are not loaded by default. For this simple example, these are the key enablers:

ZipFile class, to open the archive to parse members
https://msdn.microsoft.com/en-us/library/system.io.compression.zipfile(v=vs.110).aspx

GetEntry method, to retrieve individual file (or files) in the archive
https://msdn.microsoft.com/en-us/library/system.io.compression.ziparchive.getentry(v=vs.110).aspx?cs-save-lang=1&cs-lang=vb#code-snippet-1

StreamReader class, to read the file into memory and search content
https://msdn.microsoft.com/en-us/library/system.io.streamreader(v=vs.110).aspx

Sample Code

In this example, we will open a zip archive named scripts.zip on the D:\ drive, retrieve a file named ConnectToAzure.txt, and search its contents for a specific value.

Load the compression assemblies explicitly, then select your zip archive.

Add-Type -AssemblyName System.IO.Compression.FileSystem

$ZipArchive = "d:\scripts.zip"

Open archive for reading.

$ZipStream = [IO.Compression.ZipFile]::OpenRead($ZipArchive)

Select the item in the archive. Notice how you must reference the folder in the path to the file.

$ZipItem = $ZipStream.GetEntry('Scripts/ConnectToAzure.txt')

Open the item from the archive.

$ItemReader = New-Object System.IO.StreamReader($ZipItem.Open())

Use the StreamReader class to read the file into memory. $DocItemSet represents the contents of the file.

$DocItemSet = $ItemReader.ReadToEnd()

Search the file contents for desired value.

$DocItemSet.Contains("Connect-MsolService")
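Putting the pieces together, here is a minimal helper (the function name and parameters are my own) that opens the archive, reads an entry, and disposes of both handles, since the StreamReader and the zip stream keep the archive file open until disposed:

```powershell
Function Test-ZipEntryContains {
    param ($ZipArchive, $EntryPath, $Value)

    # Load the compression assemblies (not loaded by default)
    Add-Type -AssemblyName System.IO.Compression.FileSystem

    # Open the archive for reading
    $ZipStream = [IO.Compression.ZipFile]::OpenRead($ZipArchive)
    try {
        # Retrieve the entry; GetEntry returns $null if not found
        $ZipItem = $ZipStream.GetEntry($EntryPath)
        if ($null -eq $ZipItem) { return $false }

        # Read the entry into memory and search its contents
        $ItemReader = New-Object System.IO.StreamReader($ZipItem.Open())
        try {
            return $ItemReader.ReadToEnd().Contains($Value)
        }
        finally { $ItemReader.Dispose() }
    }
    finally { $ZipStream.Dispose() }
}
```

For example, `Test-ZipEntryContains -ZipArchive 'd:\scripts.zip' -EntryPath 'Scripts/ConnectToAzure.txt' -Value 'Connect-MsolService'` returns $true or $false without leaving a file handle open on the archive.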

Happy PowerShelling!

Creating (and deleting) checkpoints for multi-VM sets in Hyper-V

When writing custom PowerShell DSC, I am typically working with complex configurations that involve not only multiple software installs, but also database creation and domain join, resulting in changes on multiple VMs. In order to roll back operations on a front-end server, back-end SQL, and an AD domain controller, I need snapshots of all VMs taken at the same point in time. As a quick reminder to myself, I wanted to post a couple of sample PowerShell snippets for quickly creating or deleting checkpoints on all the VMs in a scenario at once. It's a time-saver when I want to roll back an entire scenario or take accurate, easily referenceable, incremental checkpoints of my scenarios as I progress. VERY simple, VERY effective, HUGE time-saver.

I prefer production checkpoints, which are the default in Hyper-V on Windows 10. If you want to be sure, look in your VM properties as shown in the image below.

(Image: VM checkpoint settings in Hyper-V Manager)

I run scenarios in Hyper-V on a Windows 10 laptop (a mobile server, really), so I keep my VMs straight by giving them a common naming prefix. For a SCOM test, I may use a prefix in Hyper-V like "SCOM-".
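If you want to double-check which VMs a prefix will match before taking checkpoints, a quick one-liner (using my "SCOM-" prefix as an example) does the trick:

```powershell
# Preview the VMs a naming-prefix pattern will match
Get-VM | Where-Object { $_.Name -like "SCOM-*" } | Select-Object Name, State
```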

Creating checkpoints in bulk

Here is a simple PowerShell snippet for creating checkpoints on multiple VMs at once. You will notice it assumes your VMs have a common prefix on their names. A date is appended to the name you provide for the snapshot, so you know at a glance when you took it.

Function CreateCheckPoint ($VMName, $SnapShotName) {

$VMs = Get-VM | WHERE {$_.Name -like "$VMName"}

$Date = Get-Date -Format g

foreach ($VM in $VMs){

Checkpoint-VM -Name $VM.Name -SnapshotName "$SnapshotName - $Date"

}
}

Call the function above like so, replacing parameter values appropriate to your environment.

CreateCheckPoint -VMName "2016*" -SnapShotName "Pre-2016 upgrade"

Deleting checkpoints in bulk

Here is a simple PowerShell snippet for deleting checkpoints on multiple VMs at once, based on VM naming prefix and snapshot name. I put the -Confirm parameter on Remove-VMSnapshot so I am prompted to review and accept deletion. You can respond to the prompt with "Yes to All", so it's a nice failsafe to ensure you don't delete the wrong snapshot.

Function RemoveCheckPoint ($VMName, $SnapShotName) {

Get-VMSnapshot -VMName $VMName | WHERE {$_.Name -like "$SnapShotName"} | Remove-VMSnapshot -Confirm
}


Call the function above like so, replacing parameter values appropriate to your environment.

RemoveCheckPoint -VMName "2016*" -SnapShotName "Pre-2016 upgrade*"

Install SQL Server Mgmt Studio (SSMS) with PowerShell

Below for reference is a quick PowerShell sample for downloading and installing SQL Server 2016 Management Studio (SSMS). While this component used to be included in the SQL Server installer, it is now a separate download.

# Set file and folder path for SSMS installer .exe
$folderpath="c:\windows\temp"
$filepath="$folderpath\SSMS-Setup-ENU.exe"

#If SSMS not present, download
if (!(Test-Path $filepath)){
write-host "Downloading SQL Server 2016 SSMS..."
$URL = "https://download.microsoft.com/download/3/1/D/31D734E0-BFE8-4C33-A9DE-2392808ADEE6/SSMS-Setup-ENU.exe"
$clnt = New-Object System.Net.WebClient
$clnt.DownloadFile($url,$filepath)
Write-Host "SSMS installer download complete" -ForegroundColor Green

}
else {

write-host "Located the SQL SSMS Installer binaries, moving on to install..."
}

# start the SSMS installer
write-host "Beginning SSMS 2016 install..." -nonewline
$Parms = " /Install /Quiet /Norestart /Logs log.txt"
$Prms = $Parms.Split(" ")
& "$filepath" $Prms | Out-Null
Write-Host "SSMS installation complete" -ForegroundColor Green
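If you also want to confirm the install succeeded rather than just waiting for it to finish, one option (a sketch, reusing $filepath and the same switches as above) is Start-Process with -Wait and -PassThru so you can inspect the installer's exit code:

```powershell
# Alternative: run the installer, wait, and inspect the exit code
$proc = Start-Process -FilePath $filepath -ArgumentList "/Install","/Quiet","/Norestart" -Wait -PassThru
if ($proc.ExitCode -eq 0) {
    Write-Host "SSMS installation complete" -ForegroundColor Green
}
else {
    Write-Warning "SSMS installer exited with code $($proc.ExitCode)"
}
```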

Questions or comments? Use the comments section below.

3 Truths of Modern Enterprise Security

The threat landscape, specifically the types and sources of threats, has changed significantly in the last few years. There are a number of readily identifiable causal factors, including:

  •  Changes in technology. The introduction of new technology results in weaknesses that are related to low technological maturity, improper use, improper integration with existing systems, low user awareness, etc.
  • Advances in the capabilities of threat agents. The skills, available tools, available resources, information on exploits, and motivation of threat agents (sometimes called 'threat actors') have evolved. The types of threat agents have evolved from the script kiddies of the '90s to today's well-resourced and highly sophisticated hackers, who in some cases are even sponsored by malicious nation-states.
  • Data growth. 90% of all the world’s data has been produced in the last two years, and with the growth of myriad devices (the Internet of Things), the target just keeps getting bigger.

With the evolution of the threat landscape comes the need to modernize our thinking and approach to security and identity. Make no mistake, firewalls and antivirus are no longer enough. To help frame some of the key challenges, here are three truths of modern enterprise security, along with some free resources to start you on the journey to modernize your approach.

Truth #1: Your trusted network is not as secure as you think

While your trusted corporate network may seem like the simplest resource to secure, it may in fact be the most vulnerable. Some of the most common points of entry to your trusted network are browser exploits, malicious document delivery, and phishing attacks. What these exploits all have in common is that they target what is perhaps the greatest vulnerability on your network: the end user. The reality is that trusting users can be fooled into clicking malicious URLs or opening infected e-mail attachments that install malware or ransomware on client computers, letting hackers and thieves through your secure network perimeter undetected.

This malware often lives undetected on your trusted network for an average of more than 200 days, listening to conversations and waiting to uncover network credentials, then stealing the secrets that enable lateral movement through your environment. Attackers compound the challenge by compromising more systems and uncovering more credentials, eventually enabling vertical movement from client to server.

Think your users are too smart to be lured into a phishing scam? Just ask Chairman of the Clinton presidential campaign, John Podesta, who fell for a phishing scam that landed his email archive on Wikileaks!

Truth #2: The network perimeter, as you know it, is history

The traditional model of the network perimeter, including firewalls and proxies and a perimeter network (aka DMZ), is dead. The perimeter, where access and authorization are enforced, can be the login screen on a mobile device, or an app installed on that device. The app is the window to your corporate data (content), and the new perimeter is the content and context by which the user tries to access that data.

The Cloud Security Alliance (CSA) advises that “identity management in the context of cloud computing should not only manage the user identities. It should extend this to manage cloud application/services identities, access control policies for these cloud applications/services, as well as privileged identities for the applications/services, etc.”.

With this in mind, organizations must rethink their approach to identity management, authentication, and authorization in a world that did not exist when the concept of a username and password entered on a PC behind a trusted network was conceived. With an increasingly mobile workforce, multi-factor authentication (MFA) is a must, and policy-based authentication that evaluates the full context of the authentication attempt (user, device, location, date/time, app, and data) is more important than ever.

And what about the security of your sensitive corporate data on employees’ personal mobile devices, full of unmanaged apps and direct connectivity to personal cloud storage?

Truth #3: Breach will happen…and you need to be prepared when it does

While post-breach detection may feel "too little, too late," it is actually a critical layer of defense, particularly as you work to mature your security posture in a race against an ever-evolving threat landscape. When a breach has occurred, detecting both weak spots and actual intrusions in the context of your computing environment, as opposed to a single device, is absolutely critical to providing visibility into the scope of items that need attention.

It is one thing to see an alert on an infected computer in your trusted network. It is quite another to see lateral movement of a malicious entity in your environment through a suspicious pattern of behavior with a common set of compromised credentials. In this case, detecting and squashing lateral movement at the client tier can prevent the next step in the intrusion process…listening for and capturing privileged credentials that enable vertical movement into server and application tiers containing sensitive business and customer data.

Finding answers to the big questions

While important, these three truths are just the tip of the iceberg and raise some very important questions:

  • How do you defend against the weakest links in your trusted, on-premises network?
  • How do you secure your sensitive corporate information on devices that could be anywhere…and beyond your management reach?
  • What type of post-breach defense can you implement to ensure you have eyes on the presence and scope of a security breach?

Making sense of the big picture of security and identity in a cloud-first, work-from-anywhere world, full of threats that marginalize the efficacy of traditional tools and techniques, can seem an impossible task. I have two concise (and free!) resources I'd like to share with you to help you on your journey:

E-book: Defending the New Perimeter: Modern Security for the Enterprise

This comprehensive, yet concise guide to Microsoft’s approach to modern enterprise security will help you get a handle on how you can implement a strong, comprehensive cybersecurity strategy with a single vendor.

Download your free copy at http://modernsecurity.info.

Azure Automation Runbook to Add Computer to ConfigMgr Collection

Below for reference is a quick PowerShell sample for adding a computer to a device collection in System Center Configuration Manager (ConfigMgr) via an Azure Automation runbook, designed for use on an OMS Hybrid Runbook Worker. It uses WMI, and so does not require the ConfigMgr PowerShell cmdlets. This also means it works for ConfigMgr 2012 and 2016.

It was used as an example in my talk “Evolving your automation strategy with OMS” at MS Ignite 2016. You can get this and the other sample from that session from my Git repo at https://github.com/pzerger/Ignite2016. The runbook is pretty well commented, but post questions beneath this post if anything is unclear.

Param(
[Parameter(Mandatory=$true)][PSCredential]$SCCMCred,
[Parameter(Mandatory=$true)][string]$CollectionName,
[Parameter(Mandatory=$true)][string]$ComputerName
)
#Retrieve SCCM site server name from Azure Automation variable
$SiteServer = Get-AutomationVariable SCCMSiteServer
Write-Verbose "SCCM Site Server: '$SiteServer'"
Write-Verbose "Connecting to SCCM Site server using user name '$($SCCMCred.UserName)'."

#Query site server WMI to get site code and SMS provider computer name
$ProviderLocation = Get-WmiObject -Namespace "Root\SMS" `
-Query "Select * from SMS_ProviderLocation" -Credential $SCCMCred -ComputerName $SiteServer
$SiteCode = $ProviderLocation.SiteCode
$SMSProvider = $ProviderLocation.Machine
Write-Verbose "SCCM Site Code: '$SiteCode'."
Write-Verbose "SMS Provider computer name: '$SMSProvider'."

#Get the collection WMI object
$Collection = Get-WmiObject -Namespace "Root\SMS\Site_$SiteCode" `
-Query "Select * from SMS_Collection Where Name = '$CollectionName'" -Credential $SCCMCred -ComputerName $SMSProvider
If ($Collection){
$CollectionID = $Collection.CollectionID
Write-Verbose "collection '$CollectionName' ID is: '$CollectionID'"
} else {
throw "Unable to find collection with name '$CollectionName'. Unable to continue."
}

#Get the computer resource
$Resource = Get-WmiObject -ComputerName $SMSProvider -Namespace "Root\SMS\Site_$SiteCode" `
-Class "SMS_R_System" -Filter "Name = '$ComputerName'" -Credential $SCCMCred | select name,resourceid
If ($Resource){
$ResourceID = $Resource.resourceid
Write-Verbose "Resource ID for computer '$ComputerName' is: '$ResourceID'"
} else {
throw "Unable to find computer resource for '$ComputerName'. Unable to continue."
}

#Create static membership rule for collection
Write-Verbose "Adding computer '$ComputerName' to collection '$CollectionName' by creating a new static membership rule."
$ruleClass = Get-WmiObject -List -ComputerName "$SMSProvider" `
-Namespace "Root\SMS\Site_$Sitecode" -Credential $SCCMCred -class "SMS_CollectionRuleDirect"
$newRule = $ruleClass.CreateInstance()
$newRule.RuleName = $($Resource.name)
$newRule.ResourceClassName = "SMS_R_System"
$newRule.ResourceID = $($Resource.resourceid)
$AddResult = ($Collection.AddMembershipRule($newRule)).ReturnValue

If ($AddResult -eq 0)
{
Write-Output "Collection `"$CollectionName`" direct membership rule successfully created for computer `"$ComputerName`", requesting refresh now."
$RefreshResult = ($Collection.RequestRefresh()).ReturnValue
If ($RefreshResult -eq 0)
{
Write-Output "Collection refresh successfully requested for `"$CollectionName`"."
} else {
Write-Error "Failed to request collection refresh for `"$CollectionName`"."
}
} else {
Write-Error "Failed to add computer '$ComputerName' as a direct member for collection `"$CollectionName`"."
}

Questions or comments? Use the comments section below.

Switch from dynamic IP to IP Pool in VMM 2016

Recently I migrated a VM from a standalone Hyper-V 2016 server to a Hyper-V cluster managed by Virtual Machine Manager (VMM) 2016. The VM was using DHCP for addressing, and I wanted to flip it to get its address from the IP pool assigned to its logical network in VMM. However, even with the VM powered off, I found the setting to flip the VM to use a static IP was grayed out in the UI.

(Image: VM network adapter settings in the VMM console)

How to fix? PowerShell, via the VMM cmdlets. Here is a sample script, intended to run on your VMM server.

$vm = Get-SCVirtualMachine -Name "<My VM Name>"

$staticIPPool = Get-SCStaticIPAddressPool -Name "<Name of my IP Pool>"

Grant-SCIPAddress -GrantToObjectType "VirtualNetworkAdapter" -GrantToObjectID $vm.VirtualNetworkAdapters[0].ID -StaticIPAddressPool $staticIPPool

Set-SCVirtualNetworkAdapter -VirtualNetworkAdapter $vm.VirtualNetworkAdapters[0] -IPv4AddressType Static
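To confirm the change took effect, you can re-read the adapter afterward (a quick sketch, using the same placeholder VM name):

```powershell
# Verify the adapter now reports a static IPv4 address type
$vm = Get-SCVirtualMachine -Name "<My VM Name>"
$vm.VirtualNetworkAdapters[0] | Select-Object Name, IPv4AddressType, IPv4Addresses
```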


Questions or comments? Use the comments section below.