Optimized Group Maintenance Mode PowerShell for OpsMgr: 2012 R2 Edition

Below is the latest sample of optimized group maintenance mode for System Center Operations Manager (SCOM). It leverages the ScheduleMaintenanceMode method, which accepts a recursion option — the programmatic equivalent of the "this object and contained objects" choice in the console. This minimizes SDK calls and processing overhead for large groups.

Syntax:
GroupMaintMode -ScomServer "mmsscom01" -GroupDisplayName "SQL Server 2008 Computers" -DurationInMin 10 -Reason "ApplicationInstallation" -Comment "Scheduled weekly maintenance"

#--------Begin Sample Script----------------
#*****************************************************************************
#
# Name: OpsMgr 2012 Group Maintenance Mode 
#
# Description: Puts Groups into Maintenance Mode 
#
# Authors: Pete Zerger and Matthew Long  
#
# Parameters:
#
# -ScomServer: mandatory parameter containing mgmt server name (Netbios or FQDN)
# -GroupDisplayName: mandatory parameter containing display name of the target group
# -DurationInMin: mandatory parameter containing integer of desired duration in minutes
# -Reason: mandatory parameter containing reason. Acceptable values are UnplannedOther, 
#           PlannedHardwareMaintenance, UnplannedHardwareMaintenance, PlannedHardwareInstallation, 
#           UnplannedHardwareInstallation, PlannedOperatingSystemReconfiguration, 
#           UnplannedOperatingSystemReconfiguration, PlannedApplicationMaintenance, 
#           ApplicationInstallation, ApplicationUnresponsive, ApplicationUnstable, 
#           SecurityIssue, LossOfNetworkConnectivity
# -Comment: optional parameter with free text description of your choice
#
#
#*****************************************************************************

Function GroupMaintMode 
(
[Parameter(Mandatory=$true)][string]$ScomServer,
[Parameter(Mandatory=$true)][string]$GroupDisplayName,
[Parameter(Mandatory=$true)][Int32]$DurationInMin,
[Parameter(Mandatory=$true)][string]$Reason,
[Parameter(Mandatory=$false)][string]$Comment
){

Import-Module OperationsManager
New-SCOMManagementGroupConnection -ComputerName $ScomServer

ForEach ($Group in (Get-SCOMGroup -DisplayName $GroupDisplayName))
    {
    If ($Group.InMaintenanceMode -eq $false)
        {
        $Group.ScheduleMaintenanceMode([datetime]::Now.ToUniversalTime(), `
            ([datetime]::Now).AddMinutes($DurationInMin).ToUniversalTime(), `
            "$Reason", "$Comment", "Recursive")
        }
    }

}

#Usage (calling the GroupMaintMode function)
GroupMaintMode -ScomServer "mmsscom01" -GroupDisplayName "SQL Server 2008 Computers" `
-DurationInMin 10  -Reason "ApplicationInstallation" -Comment "Scheduled weekly maintenance"

#--------End Sample Script----------------


Questions or comments? Use the comments section below.

OpsMgr Overrides Report: 2012 R2 Edition

The following is latest tweak of my custom Overrides Report for System Center Operations Manager 2012 R2:

# ==============================================================================================
#
# NAME: OpsMgr Overrides Report
#
# ORIGINAL AUTHOR: Daniele Muscetta and Pete Zerger
# ORIGINAL DATE : 8/24/2010
#
# EDITED BY: Arjan Vroege
# EDITED DATE : 1/24/2014
#
# COMMENT: This report will output the overrides in your OpsMgr environment including
# override settings, target, source rule/monitor and source management pack.
# ==============================================================================================
#---Save the following text as script "Export-Overrides.ps1"

#Parameters
Param(
[Parameter(Mandatory=$true)] [string]$MS,
[Parameter(Mandatory=$true)] [string]$MP_Name,
[Parameter(Mandatory=$true)] [string]$output
)

#Connect to SCOM Environment
Import-Module OperationsManager
New-SCOMManagementGroupConnection -ComputerName $MS

#define the path you want to export the CSV files to
$exportpath = ""
$MPRpt = $null
$MPRows = $null

#gets the management pack matching the supplied display name
$mps = Get-SCOMManagementPack | where {$_.DisplayName -eq $MP_Name}

#loops thru them
foreach ($mp in $mps) {
$mpname = $mp.name
Write-Host "Exporting Overrides info for Management Pack: $mpname"

#array to hold all overrides for this MP
$MPRows = @()

#Gets the actual override objects
$overrides = $mp.GetOverrides()

#loops thru those overrides in order to extract information from them
foreach ($override in $overrides) {
#Prepares an object to hold the result
$obj = new-object System.Management.Automation.PSObject

#clear up variables from previous cycles.
$overrideName = $null
$overrideProperty = $null
$overrideValue = $null
$overrideContext = $null
$overrideContextInstance = $null
$overrideRuleMonitor = $null

#give proper values to variables for this cycle. this is what we can then output.
$name = $mp.name
$overrideName = $override.Name
$overrideProperty = $override.Property
$overrideValue = $override.Value

try { $overrideContext = $override.Context.GetElement().DisplayName } catch { $overrideContext = "" }
try { $overrideContextInstance = (Get-SCOMClassInstance -Id $override.ContextInstance).DisplayName } catch { $overrideContextInstance = "" }

if ($override.Monitor -ne $null) {
$overrideRuleMonitor = $override.Monitor.GetElement().DisplayName
} elseif ($override.Discovery -ne $null) {
$overrideRuleMonitor = $override.Discovery.GetElement().DisplayName
} else {
$overrideRuleMonitor = $override.Rule.GetElement().DisplayName
}

#fills the current object with those properties

$obj = $obj | add-member -membertype NoteProperty -name overrideProperty -value $overrideProperty -passthru
$obj = $obj | add-member -membertype NoteProperty -name overrideValue -value $overrideValue -passthru
$obj = $obj | add-member -membertype NoteProperty -name overrideContext -value $overrideContext -passthru
$obj = $obj | add-member -membertype NoteProperty -name overrideContextInstance -value $overrideContextInstance -passthru
$obj = $obj | add-member -membertype NoteProperty -name overrideRuleMonitor -value $overrideRuleMonitor -passthru
$obj = $obj | add-member -membertype NoteProperty -name MPName -value $name -passthru
$obj = $obj | add-member -membertype NoteProperty -name overrideName -value $overrideName -passthru

#adds this current override to the array
$MPRows = $MPRows + $obj
}

#Store up the overrides for all packs to a single variable
$MPRpt = $MPRpt + $MPRows
}

if($output -eq "csv") {
#exports cumulative list of overrides to a single CSV
$filename = $exportpath + "overrides" + $mp.name +".csv"
$MPRpt | Export-CSV -path $filename -notypeinfo
} elseif ($output -eq "html") {
$filename = $exportpath + "overrides" + $mp.name +".htm"
$MPRpt | ConvertTo-Html | Out-File $filename
}


Free E-book: Inside the Microsoft Operations Management Suite

Tao (@MrTaoYang), Stan (@StanZhelyazkov), Anders (http://contoso.se)  and I have been working on a project for the last few weeks. We wanted to bring a learning resource for the MS Operations Management Suite to the community that is complete, comprehensive, concise…and free (as in beer). While we finish final editing passes over the next couple of weeks, we wanted to share an early copy of the book so you can start digging in while we finish our work!

Description: This preview release of “Inside the Microsoft Operations Management Suite” is an end-to-end deep dive into the full range of Microsoft OMS features and functionality, complete with downloadable sample scripts (on Github). The chapter list in this edition is shown below:

  • Chapter 1: Introduction and Onboarding
  • Chapter 2: Searching and Presenting OMS Data
  • Chapter 3: Alert Management
  • Chapter 4: Configuration Assessment and Change Tracking
  • Chapter 5: Working with Performance Data
  • Chapter 6: Process Automation and Desired State Configuration
  • Chapter 7: Backup and Disaster Recovery
  • Chapter 8: Security Configuration and Event Analysis
  • Chapter 9: Analyzing Network Data
  • Chapter 10: Accessing OMS Data Programmatically
  • Chapter 11: Custom MP Authoring
  • Chapter 12: Cross Platform Management and Automation

This early edition is being shared with the community while final edits are being completed. Please send questions, comments or errata you find to insidemscloud@outlook.com.

You can download for free from the TechNet Gallery at:
https://gallery.technet.microsoft.com/Inside-the-Operations-2928e342 

 

Centralized Logging Strategy for Azure Automation Hybrid Worker with OMS

Many System Center Orchestrator administrators longed for centralized logging of Orchestrator runbook events and results. With no native centralized logging facility, many administrators used a simple SQL database as a centralized logging repository. While not ideal, this method is functional for a single data center.

Think about the challenges of implementing this strategy on a global basis. You would either have to configure your Orchestrator runbook servers to log events to a SQL instance in a remote data center, or perhaps configure SQL replication to centralize the events after they were logged locally.

In this article, we will look at establishing a centralized runbook progress and result logging strategy for the Azure Automation Hybrid Runbook Worker using the Microsoft Operations Management Suite (OMS) that is quick, easy and searchable…no SQL database required.

In this post:

  • Hybrid Runbook Worker job logging in the Azure Portal
  • Native Logging on the Hybrid Worker
  • Why custom central logging for the Hybrid Worker?
  • Basic Custom Logging Implementation for Hybrid Worker
  • Logging runbooks events with a child runbook

Let us get down to business.

Hybrid Runbook Worker job logging in the Azure Portal

Enter the Azure Automation Hybrid Runbook Worker (hybrid worker) role, which enables Azure Automation customers to execute (run) runbooks on designated servers in their data centers around the world. The back-end infrastructure is Microsoft Azure, so implementing a global automation infrastructure is much less work than with Orchestrator.

However, the desire for detailed logging of activities on the hybrid runbook worker remains. Certainly the Jobs log (shown below) in Azure Automation shows results, inputs and outputs of runbook jobs. However, it does not provide detailed progress logging of your choosing at key points along the way.

Jobs_Log

With that in mind, the desire / need for centralized progress logging at our discretion remains.

Native runbook event logging on the Hybrid Worker

One bit of good news is that the hybrid worker includes some native logging to the Windows Event Log. You can find the log in the Windows Event Viewer under Applications and Services Logs > Microsoft-Automation > Operational.

More good news. You can easily add this log for collection by OMS in the Settings area on the Data tab. In the ‘Collect events from the following event logs’ window, type “Microsoft-Automation/Operational” and click on the plus sign (+) to add the log for collection. Then check the boxes for all three event levels.

HybWrk_Log_Collection

However, the bad news is, once this data arrives in our OMS subscription, you can see the formatting of the details in this log is less than ideal. Using the search interface, you can retrieve the contents of this log with the following query:

Type=Event EventLog="Microsoft-Automation/Operational"

The result reveals that the job status lies in the ParameterXml within the event, but only as a bare value (Succeeded, Failed, etc.) rather than a well-formatted name-value pair we could use to extract the result into a custom field for better search. As a result, this native logging is not as useful as it could be, but it can still reduce the need for automation admins to use the Azure portal.

Moreover, this still does not address our need for ad-hoc progress logging hosted in a central repository.

Why custom central logging for the Hybrid Worker?

Even with the aforementioned logging, the need to log events within a runbook at key junctures remains. This is where logging to the Windows Event Log with a custom source comes in: we write an event description detailing runbook progress or results, then have that output collected and forwarded to OMS. This gives us central logging without the need for a SQL database.

You can log this data to any local Windows Event Log, even the System log. One approach might be to use a couple of custom sources, such as one for logging runbook progress messages and another reporting the runbook result, both in an easily searchable format in OMS once the data has been collected and uploaded.

Basic Custom Logging Implementation for Hybrid Worker

I will describe a couple of simple approaches you could take for custom logging locally on the hybrid worker.

First, in order to write messages with a custom Windows event log source, the providers must exist. You can create custom event sources with the New-EventLog cmdlet. If you wanted, you could simply create these providers each time you deploy a new hybrid worker.

For runbook progress messages, I will write to the System event log with a source of OMSAutomation-Status. To create this new provider, simply run the following one-liner on the hybrid worker.

New-EventLog -LogName System -Source OMSAutomation-Status

For the final runbook result message, I will again write to the System log with a source of OMSAutomation-Result. To create this new provider, simply run the following one-liner on the hybrid worker.

New-EventLog -LogName System -Source OMSAutomation-Result

This approach ensures I can quickly and easily search runbook messages by event source. Then, any time you want to log a status message within your runbook, you would simply add a Write-EventLog message at the appropriate point in your runbook. Since the System log is already being collected in OMS, you do not have to configure any additional log collection in OMS.

Let’s assume that every time a runbook begins I log a message in the following format from within my PowerShell runbook on hybrid workers:

Write-EventLog -EventId 1 -LogName System -Message "Runbook $RBName is now beginning" -Source OMSAutomation-Status

Then in OMS I can find every message showing a runbook beginning with the following wildcard search:

Type=Event EventLog=System Source=OMSAutomation-Status "*now beginning"

Add | measure count() by Source and you can get a count of how many runbooks have run on hybrid workers.

Type=Event EventLog=System Source=OMSAutomation-Status "*now beginning" | measure count() by Source

Add the computer name to filter results to a specific hybrid worker, as shown here:

Type=Event EventLog=System Source=OMSAutomation-Status "*now beginning" Computer="VMBR01.contoso.corp" | measure count() by Source

If I wanted to find every event where a specific runbook ran, I would simply expand the search based on my message syntax, for example:

Type=Event EventLog=System Source=OMSAutomation-Status "Runbook Hello-World is now beginning"

And finally, a search for all successful completions of a specific runbook:

Type=Event EventLog=System Source=OMSAutomation-Result "Runbook Hello-World completed successfully" Computer="VMBR01.contoso.corp"

or any runbook, as pictured below

Type=Event EventLog=System Source=OMSAutomation-Result "*successfully" Computer="VMBR01.contoso.corp"

JobResult

As you can see, logging inline from within your runbook is simple, quick, easy and searchable in OMS!

Logging runbooks events with a child runbook

We could take what some might deem a slightly more elegant approach and instead call a child runbook to log the event. One advantage of this method is that we can accept the event source as an input, and check that the source exists before attempting to write the event. This means there would be no need to create the custom event sources manually. On the downside, this approach might mean slightly longer execution time, resulting in billing for a few more runbook minutes per month versus the simple inline method described above.

Here is a simple, generic and effective Windows event logging script borrowed from my friend Tao Yang. This script accepts event log name, ID, source, type and message as parameters. Before attempting to log the event, it checks for your event source on the hybrid worker. If the source does not exist, it creates the new source, then logs the event.

Param (
 [Parameter(Mandatory=$true)][string]$EventLog,
 [parameter(Mandatory=$true)][int]$EventID,
 [Parameter(Mandatory=$true)][string]$EventSource,
 [Parameter(Mandatory=$true)][string]$EntryType,
 [Parameter(Mandatory=$true)][string]$Message
)

$ExistingEventSource = Get-Eventlog -LogName $EventLog -Source $EventSource -ErrorAction SilentlyContinue
If (!$ExistingEventSource)
{
 #The event source does not exist, create it now
 New-EventLog -LogName $EventLog -Source $EventSource
}
Write-EventLog -LogName $EventLog -source $EventSource -EventId $EventID -EntryType $EntryType -Message $Message

Here is a sample usage of this runbook you can paste into your Azure Automation runbooks destined for hybrid workers. This assumes you have added the above sample runbook to your Azure Automation subscription with the name Write-WindowsEvent.

.\Write-WindowsEvent.ps1 -Message 'Test message' -EntryType 'Information' -EventLog 'System' -EventSource OMSAutomation -EventID 50001

Conclusion

As you can see, Azure Automation, the new hybrid worker and OMS open the door to easy and flexible implementation of a global, hybrid runbook automation infrastructure that transcends the limitations of System Center Orchestrator. Get started with your free subscription to OMS at http://microsoft.com/oms.

How to Query Azure SQL Databases via PowerShell with SMO

In previous articles, I have demonstrated how to create Azure SQL Server and database instances via PowerShell in:

Certainly you could create both your Azure SQL Server instance and the database in the Azure Portal, so do as you prefer. In this post, I will ultimately give you the steps to connect to Azure SQL and work with a bit with the dataset returned.

Why SMO? We’re going to use Shared Management Objects (SMO) in this example, which is a convenient approach because it really only requires that SMO (and its dependencies) be installed where you run the PowerShell script. You don’t need any specialized SQL Server PowerShell modules or management tools.

Prerequisites

In order to connect, you will need to ensure the following are installed on the machine you are working from:

  • Microsoft SQL Server System CLR Types
  • SQL Server Shared Management Objects (SMO)

TIP: Make sure you load the CLR Types first, as SMO requires CLR Types in order to install successfully. Both of these components are on the same page on the MS site (http://www.microsoft.com/en-us/download/details.aspx?id=29065)
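As a quick sanity check before running the sample script, you can attempt to load SMO and warn if it is missing (a sketch, assuming the default GAC installation of SMO):

```powershell
# Attempt to load SMO from the GAC; LoadWithPartialName returns $null when not found
$smo = [Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
if (-not $smo) {
    Write-Warning "SMO not found. Install the CLR Types first, then SMO, and try again."
}
```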

Firewall Rule

Another prerequisite is a firewall rule on your Azure SQL Server. Azure SQL blocks all remote connections by default, so you have to create a firewall rule. The Azure Portal will show you what it sees as your current IP, making it easy to add an exclusion for yourself.

To create an exclusion for all IPs, simply enter a range of 0.0.0.0 – 255.255.255.255. Obviously this is not the most secure option, so open up only to the source IPs that really require access.
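If you prefer PowerShell over the portal, the same exclusion can be scripted. Here is a sketch for a single-IP rule; the server name, rule name and IP address below are placeholders you would replace with your own values:

```powershell
# Allow a single source IP on the Azure SQL server (values are placeholders)
New-AzureSqlDatabaseServerFirewallRule -ServerName "mySQLServer" `
    -RuleName "AllowMyWorkstation" `
    -StartIpAddress 203.0.113.10 -EndIpAddress 203.0.113.10
```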

Azure SQL

How to create firewall rules for Azure SQL is also available at http://blogs.msdn.com/b/umits/archive/2012/11/20/windows-azure-sql-database-how-to-manage-your-firewall-rules-with-windows-azure-management-portal.aspx.

Sample Script

The sample code here will run whatever T-SQL SELECT, INSERT, UPDATE or DELETE query you assign to the $sql variable.

# Connect to SQL
[Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
$s = new-object ('Microsoft.SqlServer.Management.Smo.Server') 'tcp:mySQLServer.database.windows.net'
$s.ConnectionContext.LoginSecure = $false
$s.ConnectionContext.Login = "myuser"
$s.ConnectionContext.Password = "P@ssw0rd1!"

# Set DB
$db = $s.Databases["MySQLDB"]

# Build SQL Query (SELECT, INSERT, UPDATE or DELETE)
$sql = 'SELECT * FROM MySQLTable'

#Submit SQL Query (sample 1)
$ds = $db.ExecuteWithResults($sql).Tables[0]

At this point, your dataset is stored in the $ds variable.

If you wanted simply to dump your dataset to a spreadsheet, you could change the line with $ds to the following:


#Submit SQL Query and write results to CSV (sample 2)
$OutputFile = 'c:\mydir\myfile.csv'
$ds = $db.ExecuteWithResults($sql).Tables[0]  | Export-Csv "$OutputFile" -NoTypeInformation

If you wanted to loop through a multi-row dataset and write to your console, flip the last section to this snippet. Note that sample 3 keeps the full DataSet (no .Tables[0]) so it can loop through the tables:

#Submit SQL Query and loop through results (sample 3)
$ds = $db.ExecuteWithResults($sql)
Foreach ($t in $ds.Tables)
{
   Foreach ($r in $t.Rows)
   {
      Foreach ($c in $t.Columns)
      {
          Write-Host $c.ColumnName "=" $r.Item($c)
      }
   }
}

Hopefully this gives you a nice jump start. I did not mention stored procedures, which I’ll cover in a quick post in the near future.

How to stop and start Azure VMs in bulk

Starting and stopping Azure VMs in bulk is pretty easy in PowerShell, but you want to filter your VMs based on the correct property. VMs have a Status property with values such as StoppedDeallocated, Stopped, ReadyRole, etc. If you are attempting to identify VMs based on status, you will have to write your script to filter on multiple criteria.

A better option is to use the PowerState property. Your VMs are going to have a PowerState of “Started” or “Stopped”, so turning them on (or off) in bulk based on this property is quite simple. You could take these one-liners into simple Windows scheduled tasks to turn your development and test environments down after quitting time and bring them back online before the team gets into the office the next morning, reducing your Azure spend on non-production compute resources.
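To put this on a schedule, one approach is a Windows scheduled task that runs a script containing the stop one-liner. The following is a sketch; the task name, script path and 7:00 PM trigger are assumptions you would adjust for your environment:

```powershell
# Register a daily task that runs a shutdown script after quitting time (paths are placeholders)
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument '-File C:\Scripts\Stop-DevVMs.ps1'
$trigger = New-ScheduledTaskTrigger -Daily -At 7:00PM
Register-ScheduledTask -TaskName 'Stop Dev VMs' -Action $action -Trigger $trigger
```

A matching task with a morning trigger and a start script brings the environment back online before the team arrives.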

Below are a handful of one-liners and a description of the nuances of each.

This first one-liner will shut down every VM in your subscription, but you will be prompted on the last (or only) VM in a cloud service with a "continuing will result in a new IP address for your deployment" message, and further direction that using the -StayProvisioned parameter will ensure your VM keeps its IP address. However, when you use that parameter, the VM remains provisioned, so you also keep getting billed. For test and dev environments, dynamic DNS registration and a short time-to-live (TTL) on DNS records should be good enough to allow you to simply shut down and deallocate to save dollars.

Get-AzureVM | Where-Object {$_.PowerState -eq 'Started'} | Stop-AzureVM

If you add the -Force parameter to the end of the one-liner (as shown below), the VMs are stopped and deallocated (meaning you are only being billed for the storage they consume) without the prompt.

Get-AzureVM | Where-Object {$_.PowerState -eq 'Started'} | Stop-AzureVM -Force

Replace the -Force parameter with -StayProvisioned as shown below and the annoying prompt is still avoided, but your VMs wind up with a status of "Stopped" instead of "StoppedDeallocated". This means the VMs will come back online with the same IP address, but the billing does not stop between power off and the next power on.

Get-AzureVM | Where-Object {$_.PowerState -eq 'Started'} | Stop-AzureVM -StayProvisioned

To start the VMs up again, you can simply reverse the flow with a couple of small changes, as shown below.

Get-AzureVM | Where-Object {$_.PowerState -eq 'Stopped'} | Start-AzureVM

Naturally, you can add multiple criteria to the Where-Object filtering to filter based on VM name, cloud service name and other properties to shut down specific subsets of your VMs. In a future post we will look at scheduled tasks, time-based actions and moving this process into Azure’s PowerShell Workflow based orchestration engine, Azure Automation.

How to find if an Azure VM exists via PowerShell

One task that can help optimize deployments in Azure is checking for the existence of a VM before you attempt to deploy. This can be particularly useful when you attempt to restart a failed deployment of multiple VMs. Because every VM in Azure resides within a cloud service, checking for a VM can be a bit more work than in an on-premises scenario with VMM and Hyper-V.

If you know the name of the service the VM should reside within your Azure subscription, the check is quick and easy.

if (!(Get-AzureVM -ServiceName "MyCloudService" -Name "MyVMName"))
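To make the check actionable, you might wrap it as below. This is a sketch; the service name, VM name and deployment step are placeholders:

```powershell
# Deploy only if the VM does not already exist in the known cloud service
if (!(Get-AzureVM -ServiceName "MyCloudService" -Name "MyVMName")) {
    Write-Host "VM not found - safe to deploy"
    # ...your New-AzureVM deployment steps here...
}
else {
    Write-Host "VM already exists - skipping deployment"
}
```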

On the other hand, sometimes you will not know which cloud service the VM resides in (as when generating random cloud service names in a bulk deployment). In this case, you will need to check every cloud service for the existence of the VM. The function below is one example of how to determine if a VM exists within an Azure subscription when you do not know the name of the cloud service. The last line calls the function and provides the name of a VM.

Function Check-AllServices4VM ($vmName) {
    $cloudServices = Get-AzureService

    ForEach ($Service in $cloudServices) {
        If (Get-AzureVM -ServiceName $Service.ServiceName -Name $vmName) {
            Write-Host "Azure VM by that name already exists in service" $Service.ServiceName "...terminating"
            Break
        }
        Else {
            Write-Host "Azure VM does not exist in service" $Service.ServiceName
        }
    }
} # end of function

# Call the function and provide VM name
Check-AllServices4VM -vmName SQLClient

That’s all for today. Leave any thoughts in comments.

How to Create an Azure SQL Database via PowerShell

As I mentioned yesterday in "How to Create an Azure SQL Instance via PowerShell", Azure SQL provides some cost advantages over running SQL in Azure VMs for scenarios where all you need is a database. In yesterday's article, we created an Azure SQL Server instance, as well as a firewall rule on the instance to allow remote connections.

Now, we are ready to create a database in our new Azure SQL instance. Start by connecting to your Azure subscription in the standard manner (see resources in the article linked above if connecting to your Azure sub in PowerShell is not familiar).

Then, we create a credential to store the user and password we will use to authenticate to the Azure SQL instance.

$User = 'pzerger'
$PWord = ConvertTo-SecureString -String 'Pa$$w0rd!' -AsPlainText -Force
$Credential = New-Object -TypeName System.Management.Automation.PSCredential `
-ArgumentList $User, $PWord

Next, we will connect and authenticate to the instance.

$sqlContext = New-AzureSqlDatabaseServerContext `
-ServerName "xne7lwczta" -Credential $Credential

Then, we will select the level of service, which determines the max database size, connection and session count, and transaction rate. There are many levels of service across the Basic, Standard and Premium tiers. You can find a current list of available service levels at https://msdn.microsoft.com/en-us/library/azure/dn741336.aspx.

In this example, I will specify S0, the lowest level of the Standard tier, and provide along with that the sqlContext object from the previous step, which stores the SQL instance name and credentials.

$S0 = Get-AzureSqlDatabaseServiceObjective `
-Context $sqlContext -ServiceObjectiveName S0

In the last step, we create the database. In this case, I am creating a database named “InsideTheMSCloud”.

New-AzureSqlDatabase -ConnectionContext $sqlContext `
-DatabaseName "InsideTheMSCloud" -Edition "Standard" -MaxSizeGB 1 `
-Collation "SQL_Latin1_General_CP1_CI_AS" -ServiceObjective $S0

Now we have an Azure SQL (PaaS) instance, a firewall rule to allow remote connections, and a database. In the next post on this topic, we’ll do something with this database.

How to Create an Azure SQL Instance via PowerShell

I wanted to share a bit about Azure SQL (PaaS) in Azure, as I believe it will be commonly used by IT Pros in the future, particularly as a logging and reference database when working with Azure Automation. One compelling reason to choose Azure SQL over SQL Server in Azure VMs is cost. If you just need a little database, you can have what you need in Azure SQL for as little as free or a few cents a month. SQL Server in Azure VMs is going to cost the price of the VM plus the SQL license if you don’t bring your own. You can read about the differences between the two at http://azure.microsoft.com/en-us/documentation/articles/data-management-azure-sql-database-and-sql-server-iaas/.

To create an Azure SQL Server instance, start by connecting to your Azure subscription in PowerShell as you normally do. If you are unsure of the steps, see http://azure.microsoft.com/en-us/documentation/articles/powershell-install-configure/.

Next, we create an Azure SQL instance as follows. This creates a new Azure SQL Server instance, with a random name chosen by Azure.

#Create Azure SQL Server
$location = 'East US'
$login = 'pzerger'
$password = 'Pa$$w0rd!'
$newSqlServer = New-AzureSqlDatabaseServer -Location $location -AdministratorLogin $login `
-AdministratorLoginPassword $password -Version "2.0"

Get the name of the Azure SQL instance name like so:

$newSqlServer.ServerName

All connections to this SQL instance will be blocked by default, so you need to create a firewall rule to allow all connections. For this one, I just create a rule to allow connections from any IP address.

New-AzureSqlDatabaseServerFirewallRule -ServerName $newSQLServer.ServerName `
-RuleName "Allow all IP addresses to connect" `
-StartIpAddress 0.0.0.0 -EndIpAddress 255.255.255.255

Now we have an Azure SQL Server instance that allows connections from any IP Address. In a future post, we will connect and create a database and start making use of our resources.

Retrieve the latest version of a Gallery Image in Azure PowerShell

Here is a handy snippet I use fairly often. When you retrieve an Azure gallery image by name, you will find there are multiple versions of the image.

Get-AzureVMImage -Verbose:$false | Where-Object {$_.Label -like "Windows Server 2012 Datacenter*"} | Select-Object Label, PublishedDate

You will typically find at least two versions of each image, which can be differentiated by date, as shown below.

Label                                          PublishedDate
-----                                          -------------
Windows Server 2012 Datacenter, February 2015  2/11/2015 2:00:00 AM
Windows Server 2012 Datacenter, March 2015     3/12/2015 2:00:00 AM

Finding the latest image is as simple as retrieving all of them and sorting in descending order based on the published date. The code below will arrange the images in descending order by published date, then retrieve the first in the list.

(Get-AzureVMImage -Verbose:$false | Where-Object {$_.Label -like "Windows Server 2012 Datacenter*"} | Sort-Object -Descending PublishedDate)[0].ImageName

We can save this result to a variable and proceed with deployment.

$winImageName = (Get-AzureVMImage | where {$_.Label -like "Windows Server 2012 Datacenter*"} | sort PublishedDate -Descending)[0].ImageName
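From there, the image name can feed straight into a new VM configuration. The following is a sketch; the VM name, instance size, credentials, cloud service name and location are all placeholders:

```powershell
# Use the latest image name in a new classic (ASM) VM deployment (names are placeholders)
$vmConfig = New-AzureVMConfig -Name 'MyVM01' -InstanceSize 'Small' -ImageName $winImageName |
    Add-AzureProvisioningConfig -Windows -AdminUsername 'localadmin' -Password 'P@ssw0rd1!'
New-AzureVM -ServiceName 'MyCloudService' -Location 'East US' -VMs $vmConfig
```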

I’ll follow this post with a many more in this vein, offering some tips to optimize your Azure scripting.