
Create SSRS Data Source

Continuing my earlier discussion on SSRS and testing data sources, I've come up with a means to create a data source. This article demonstrates how I did that.

When testing a data source I had to create a proxy to the SSRS server; we'll need to do the same thing here so we can get to the CreateDataSource method for the data source.

$reportWebService = 'http://yourReportServer/ReportServer/ReportService2010.asmx'
$credentials = Get-Credential
$reportproxy = New-WebServiceProxy -uri $reportWebService -Credential $credentials

The $reportWebService value is the URL of the web service on my SSRS instance, which, when proxied, gives me access to all the methods and properties of the ReportService2010.asmx class.
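Because the proxy is a live .NET object, you can explore the methods it exposes straight from the shell; a quick sketch using the proxy created above:

$reportproxy | Get-Member -MemberType Method | Where-Object { $_.Name -like '*DataSource*' }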

The method we’ll be using for this discussion is ReportingService2010.CreateDataSource.

This method takes five parameters:

[string] DataSource,
[string] Parent,
[boolean]Overwrite,
[DataSourceDefinition] Definition,
[Property[] ]Properties

DataSource = A string: the name for the data source, including the file name and, in SharePoint mode, the extension (.rsds).

Parent = The fully qualified URL for the parent folder that will contain the data source. In my case I'm going to use /ThomTest, where ThomTest is a folder at the root of the SSRS server.

Overwrite = Tells the method to overwrite an existing data source if one is already there.

Definition = A DataSourceDefinition object that contains the values for the data source. This includes things like:

ConnectString, CredentialRetrieval, Enabled, EnabledSpecified, ImpersonateUser, ImpersonateUserSpecified, Password, Prompt, UserName, WindowsCredentials

For each of the above properties here is what I’ve been able to discover so far for where they are used:

[Screenshot: table of where each DataSourceDefinition property is used]

Properties = A ReportService2010.Property[] array whose entries largely mirror the data source definition, so some of the same data collected to create the definition is reused in this property array.

The tough part of creating the data source was getting the values passed into the PowerShell function accepted by the proxied method. I stumbled on a great article on StackOverflow that showed how to get at the classes from the proxied web service via calls similar to the one below:

$ssrsproxy = New-SSRSProxy -reportWebService $reportWebService `
-Credentials $credentials
 $proxyNameSpace = $ssrsproxy.gettype().Namespace

So to get to the class I need for the DataSourceDefinition, all I need to do is append the class name to the proxy's namespace:

$proxyNameSpace = $ssrsproxy.gettype().Namespace 
$datasourceDef = New-Object("$proxyNameSpace.DataSourceDefinition")

Now my $datasourceDef is a DataSourceDefinition object containing the properties I showed above. Since it's an object, all I need to do to set the items I need is refer to them via dot notation:

$datasourceDef = New-Object("$proxyNameSpace.DataSourceDefinition")
PS PS:\> $datasourceDef


Extension : 
ConnectString : 
UseOriginalConnectString : False
OriginalConnectStringExpressionBased : False
CredentialRetrieval : Prompt
WindowsCredentials : False
ImpersonateUser : False
ImpersonateUserSpecified : False
Prompt : 
UserName : 
Password : 
Enabled : False
EnabledSpecified : False

PS PS:\> $datasourcedef.Connectstring = 'MyConnectionString'
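Any of the other definition properties can be set the same way. CredentialRetrieval is an enum on the proxied type, but PowerShell converts the string for you, just as the full script later does:

PS PS:\> $datasourceDef.Extension = 'SQL'
PS PS:\> $datasourceDef.CredentialRetrieval = 'Store'
PS PS:\> $datasourceDef.UserName = 'domain\user'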

OK, now the last parameter, the Properties array, is the tough one. This is where I had to get help from @Poshoholic on how to turn a hashtable of values into the array of properties that CreateDataSource will accept.

Here is what the Hashtable looks like:

 PS PS:\> $datasourceDefHash = @{
 'ConnectString' = $connectString; 'UserName' = $username; `
'Password' = $password; 'WindowsCredentials' = $windowsCredentials; `
'Enabled' = $enabled; 'Extension' = $Extension; `
'ImpersonateUser' = $ImpersonateUser; 'ImpersonateUserSpecified' = $true; `
'CredentialRetrieval' = $credentialRetrieval
 }

My understanding is that what's needed is a property collection, so that's what I named my variable:

 $propertyCollection = $datasourceDefHash.Keys.foreach{
     @{ Name = $_; Value = $datasourceDefHash[$_] } -as "${proxyNamespace}.property" }

The magic here is iterating through the keys and casting each name/value pair to $proxyNamespace.property, which gives us our ReportService2010.Property[] array. @Poshoholic informed me that because the name of the class is dynamic, we have to use the -as keyword to allow it to be 'cast' to the type we need. Wow, I'm glad he helped me or I'd have been here a very long time.
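To see why -as is needed: a literal cast like [ReportService2010.Property] has to resolve at parse time, while -as takes a string and resolves the type when the line runs. Here's a minimal sketch of one element of that collection, using the namespace variable captured earlier:

 $oneProperty = @{ Name = 'Description'; Value = 'Created by script' } -as "${proxyNamespace}.Property"
 $oneProperty.GetType().FullName # resolves to the proxy's generated Property type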

Now to put it all together. I originally wrote this function to allow for continuous deployment and creation of data sources. The only option I really wanted to use was storing the username and password (credentials stored securely in the report server), with the 'Use as Windows credentials when connecting to the data source' checkbox checked and the username and password supplied when calling the function.

So here is what my param block looks like:

  param
 (
 [Parameter(Mandatory = $false)]
 [string]$DataSourceName,
 [string]$path,
 [Parameter(Mandatory = $false)]
 [uri]$reportWebService,
 [string]$connectString,
 [string]$password,
 [string]$username,
 [ValidateSet('SQL', 'SQLAZURE', 'OLEDB', 'OLEDB-MD', 'ORACLE', 'ODBC', 'XML',
 'SHAREPOINTLIST', 'SAPBW', 'ESSBASE', 'Report Server FileShare', 'NULL',
 'WORDOPENXML', 'WORD', 'IMAGE', 'RPL', 'EXCELOPENXML', 'EXCEL', 'MHTML',
 'HTML4.0', 'RGDI', 'PDF', 'ATOM', 'CSV')]
 [string]$Extension = 'SQL',
 [boolean]$windowsCredentials = $false,
 [boolean]$enabled = $true,
 [boolean]$ImpersonateUser = $false ,
 [ValidateSet('None', 'Prompt', 'Integrated', 'Store')]
 [string]$credentialRetrieval = 'Store',
 [System.Management.Automation.PSCredential]$credentials
 )

Now that the user is passing in their credentials and the items I need (with default values), I can call the methods and items described above:

  #https://msdn.microsoft.com/en-us/library/reportservice2010.reportingservice2010.createdatasource.aspx
 $ssrsproxy = New-SSRSProxy -reportWebService $reportWebService -Credentials $credentials
 $proxyNameSpace = $ssrsproxy.gettype().Namespace
 #https://msdn.microsoft.com/en-us/library/reportservice2010.datasourcedefinition.aspx
 $datasourceDef = New-Object("$proxyNameSpace.DataSourceDefinition") #a definition is needed because the create expects an object with some of the properties set.
 #$dataSourceProps = New-Object ("$proxyNameSpace.property")
 #$ssrsExtensions = ($ssrsproxy.ListExtensions('All')).name `
 #-join "','" for creating the set statement for extensions.
 #for some reason you have to set the extension and datasource in the definition before attempting to create.
 $datasourceDef.connectstring = $connectString
 $datasourcedef.Extension = $Extension
 if ($credentialRetrieval -eq 'Store')
 {
 $datasourceDef.WindowsCredentials = $WindowsCredentials
 $datasourceDef.password = $password
 $datasourceDef.CredentialRetrieval = $credentialRetrieval
 $datasourceDef.username = $username
 }
 $datasourceDefHash = @{
 'ConnectString' = $connectString; 'UserName' = $username; 'Password' = $password; 'WindowsCredentials' = $windowsCredentials; 'Enabled' = $enabled; 'Extension' = $Extension; 'ImpersonateUser' = $ImpersonateUser; 'ImpersonateUserSpecified' = $true; 'CredentialRetrieval' = $credentialRetrieval
 }
 #convert the hashtable to an array of proxynamespace property items. https://msdn.microsoft.com/en-us/library/reportservice2010.property.aspx
 $propertyCollection = $datasourceDefHash.Keys.foreach{
     @{ Name = $_; Value = $datasourceDefHash[$_] } -as "${proxyNamespace}.property" }
 try
 {
 $ssrsproxy.CreateDataSource($DataSourceName, $path, $true, $datasourceDef, $propertyCollection)
 }
 catch
 {
 "Error was $_"
 $line = $_.InvocationInfo.ScriptLineNumber
 "Error was in Line $line"
 }

The actual piece that does the creation of the data source is this line:

$ssrsproxy.CreateDataSource($DataSourceName, $path, $true, $datasourceDef, `
$propertyCollection)

The script in its full form is below:

<#
 .SYNOPSIS
 Creates an SSRS data source
 
 .DESCRIPTION
 This script creates a datasource from the PowerShell prompt.
 
 .PARAMETER DataSourceName
 The name to give the data source that will be created.
 
 .PARAMETER path
 Path to where the datasource will be created, starting at the SSRS root.
 For example, with /report/report the data source is created under the second 'report' folder.
 
 .PARAMETER reportWebService
 URI to the location of the reportingService 2010 asmx page.
 
 .PARAMETER connectString
 This is the connection string that you use to connect to your database.
 
 .PARAMETER password
 Password to use if you are storing the credentials on the SQL server.
 
 .PARAMETER UserName
 Username to use for the connection if you are storing the credentials on the SQL Server.
 
 .PARAMETER Extension
 The Extension parameter is described as the Data Source Type in the new data source window in SSRS. Depending on your installation you may or may not have the items specified in the set statement for this function:
 'SQL' = SQL Server Connection
 'SQLAZURE' = SQL Azure Connection
 'OLEDB' = OLEDB connection 
 other possible connections include: 'OLEDB-MD','ORACLE','ODBC','XML','SHAREPOINTLIST','SAPBW','ESSBASE','Report Server FileShare','NULL','WORDOPENXML','WORD','IMAGE','RPL','EXCELOPENXML','EXCEL','MHTML','HTML4.0','RGDI','PDF','ATOM','CSV'
 
 .PARAMETER windowsCredentials
 windowsCredentials = When using 'Store' with credential retrieval this sets the data source to 'Use as Windows credentials when connecting to the data source' 
 
 .PARAMETER enabled
 This Tells SSRS to enable the data source.
 
 .PARAMETER ImpersonateUser
 Set this to true if you want to use the 'Impersonate the authenticated user after a connection has been made to the data source' option.
 
 .PARAMETER credentialRetrieval
 CredentialRetrieval = one of four values:
 None = Credentials are not required
 Store = Credentials stored securely in the report server
 requires setting the username and password and optional params are impersonate and windowsCredentials
 Prompt = Credentials supplied by the user running the report
 Integrated = Windows integrated security
 
 .PARAMETER Credentials
 The credentials parameter is required to access the web service. It should be a [System.Management.Automation.PSCredential] object.
 
 .EXAMPLE
 PS C:\> $reportWebService = 'http://mySSRSServer//reportserver/reportservice2010.asmx'
 PS C:\> New-SSRSDataSource -DataSourceName 'ThomTest' -path '/ThomTest' -reportWebService $ReportWebService -connectString 'Data Source=servername;Initial Catalog=DB;Integrated Security=True' -username 'domain\user' -password 'password' -Extension SQL -enabled $true -windowsCredentials $true -credentialRetrieval Store -impersonateuser $true -credentials $credentials
 
 .NOTES
 Additional information about the function.
#>
function New-SSRSDataSource
{
 [CmdletBinding()]
 param
 (
 [Parameter(Mandatory = $false)]
 [string]$DataSourceName,
 [string]$path,
 [Parameter(Mandatory = $false)]
 [uri]$reportWebService,
 [string]$connectString,
 [string]$password,
 [string]$username,
 [ValidateSet('SQL','SQLAZURE','OLEDB','OLEDB-MD','ORACLE','ODBC','XML','SHAREPOINTLIST','SAPBW','ESSBASE','Report Server FileShare','NULL','WORDOPENXML','WORD','IMAGE','RPL','EXCELOPENXML','EXCEL','MHTML','HTML4.0','RGDI','PDF','ATOM','CSV')]
 [string]$Extension = 'SQL',
 [boolean]$windowsCredentials = $false,
 [boolean]$enabled = $true,
 [boolean]$ImpersonateUser = $false ,
 [ValidateSet('None', 'Prompt', 'Integrated', 'Store')]
 [string]$credentialRetrieval = 'Store',
 [System.Management.Automation.PSCredential]$credentials
 )
 #https://msdn.microsoft.com/en-us/library/reportservice2010.reportingservice2010.createdatasource.aspx
 $ssrsproxy = New-SSRSProxy -reportWebService $reportWebService -Credentials $credentials
 $proxyNameSpace = $ssrsproxy.gettype().Namespace
 #https://msdn.microsoft.com/en-us/library/reportservice2010.datasourcedefinition.aspx
 $datasourceDef = New-Object("$proxyNameSpace.DataSourceDefinition") #a definition is needed because the create expects an object with some of the properties set.
 #$dataSourceProps = New-Object ("$proxyNameSpace.property")
 #$ssrsExtensions = ($ssrsproxy.ListExtensions('All')).name #-join "','" for creating the set statement for extensions.
 #for some reason you have to set the extension and datasource in the definition before attempting to create. 
 $datasourceDef.connectstring = $connectString
 $datasourcedef.Extension = $Extension
 if ($credentialRetrieval -eq 'Store')
 {
 $datasourceDef.WindowsCredentials = $WindowsCredentials
 $datasourceDef.password = $password
 $datasourceDef.CredentialRetrieval = $credentialRetrieval
 $datasourceDef.username = $username
 }
 $datasourceDefHash = @{
 'ConnectString' = $connectString; 'UserName' = $username; 'Password' = $password; 'WindowsCredentials' = $windowsCredentials; 'Enabled' = $enabled; 'Extension' = $Extension; 'ImpersonateUser' = $ImpersonateUser; 'ImpersonateUserSpecified' = $true; 'CredentialRetrieval' = $credentialRetrieval
 }
 #convert the hashtable to an array of proxynamespace property items. https://msdn.microsoft.com/en-us/library/reportservice2010.property.aspx
 $propertyCollection = $datasourceDefHash.Keys.foreach{ @{ Name = $_; Value = $dataSourceDefHash[$_] } -as "${proxyNamespace}.property" }
 try
 {
 $ssrsproxy.CreateDataSource($DataSourceName, $path, $true, $datasourceDef, $propertyCollection)
 }
 catch
 {
 "Error was $_"
 $line = $_.InvocationInfo.ScriptLineNumber
 "Error was in Line $line"
 }

}

function New-SSRSProxy
{
 param
 (
 [string]$reportWebService,
 [Parameter(Mandatory = $true,
 ValueFromPipeline = $true,
 ValueFromPipelineByPropertyName = $true)]
 [System.Management.Automation.PSCredential]$Credentials
 )
 Begin
 {
 if ($reportWebService -notmatch 'asmx')
 {
 $reportWebService = "$reportWebService/ReportService2010.asmx?WSDL"
 #$reportWebServiceurl = $reportWebServiceUrl.Replace("//","/")
 }
 }
 Process
 {
 #Create Proxy
 Write-Verbose "Creating Proxy, connecting to : $reportWebService"
 $ssrsProxy = New-WebServiceProxy -Uri $reportWebService -UseDefaultCredential -ErrorAction 0
 #Test that we're connected
 $members = $ssrsProxy | get-member -ErrorAction 0
 if (!($members))
 {
 if (!$Credentials)
 {
 $Credentials = Get-Credential -Message 'Enter credentials for the SSRS web service'
 }
 $ssrsProxy = New-WebServiceProxy -Uri $reportWebService -Credential $Credentials
 }
 $ssrsProxy
 }
 End { }
}

I hope this helps someone

Until then keep scripting

 

Thom


Testing an SSRS DataSource

Recently I've been working with SSRS, adding and removing data sources and testing them. This article describes how I got there.

The first thing I did was figure out how I was going to connect to the SSRS website. I found that the best way to do this was with New-WebServiceProxy. The proxy requires a URI and a credential object:

$reportWebService = 'http://yourReportServer/ReportServer/ReportService2010.asmx'
$credentials = Get-Credential
$reportproxy = New-WebServiceProxy -uri $reportWebService -Credential $credentials

The $reportWebService value is the URL of the web service on my SSRS instance, which, when proxied, gives me access to all the methods and properties of the ReportService2010.asmx class.

The next thing I had to figure out was whether the value I'll pass for $datasource is actually a data source. After sleuthing around I found a method that determines the type of the item at a passed-in path: GetItemType. If I pass my $datasource value to this method, it returns what type of "thing" it is. So, using the report proxy above, let's see what comes back.

$datasource = '/ssrs/mydatasource'
$reportproxy.GetItemType($datasource)   
DataSource

To find the possible types, run ListItemTypes() against the report proxy:

PS C:\> $reportproxy.listitemtypes() 
Unknown 
Folder 
Report 
Resource 
LinkedReport 
DataSource 
Model 
Site 
DataSet 
Component

As you can see there are several item types. The one we want to test with is DataSource. Now that we know the user passed us a data source, we can see whether we have the ability to view it. This is done with the GetDataSourceContents method.

$validObject = $reportProxy.Getdatasourcecontents($datasource)
Extension : SQL 
ConnectString : Data Source=SomeServer;Initial Catalog=dsn1 
UseOriginalConnectString : False 
OriginalConnectStringExpressionBased : False 
CredentialRetrieval : Store 
WindowsCredentials : True 
ImpersonateUser : False 
ImpersonateUserSpecified : True 
Prompt : Type or enter a user name and password to access the data source: 
UserName : yourDomain\ServiceAccount
Password : 
Enabled : True 
EnabledSpecified : True

Now that we know we can get to the data source, we can test it. This is done by calling the TestConnectForItemDataSource method. The method takes five arguments, and this is where I spent a great deal of time trying to figure out the right ones to pass. Two of them (the username and password) come from the $validObject variable, and the last one is a [ref] type that I had not used before. Here is how I was able to do this:

$tempRef = $true # initialize a variable so it can be used as a [ref] in the next method call
 $validConnect = $reportproxy.TestConnectForItemDataSource($datasource, $datasource,
     ($validObject.username), ($validObject.password), ([ref]$tempRef))

The first argument, $datasource, is our data source. The second is the data source again. This tripped me up because I was thinking I'd have to parse the data source name out of the fully qualified path. I tried passing both the bare name and the full path, and both seemed to work equally well, so I took the less-work method: datasource, datasource.

The third param is the username. The fourth is the password, which appears $null when you look at the variable in the shell; trying to inspect its contents just produces errors about a null-valued expression. I carried on and assumed the server had the password, because testing in the GUI works fine.

Now onto the fifth param. I first passed a blank variable here and I would get errors:

Argument: ‘4’ should be a System.Management.Automation.PSReference. Use [ref].

I then ran the same method with [ref]$tempRef and found that it errors if the variable doesn't exist yet. Thanks to Joel Bennett I was able to figure it out:

At line:1 char:1
+ $reportproxy.TestConnectForItemDataSource($datasource, $datasource, ( …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (tempRef:VariablePath) [], RuntimeException
+ FullyQualifiedErrorId : NonExistingVariableReference

So I declared the variable and then I was able to successfully test the connection:

$validConnect = $reportproxy.TestConnectForItemDataSource($datasource, $datasource,
    ($validObject.username), ($validObject.password), ([ref]$tempRef))
 
True
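As I understand it, that [ref] argument maps to the web method's out parameter, so after the call $tempRef holds the connection error text if the test failed. A quick sketch of checking it:

 if (-not $validConnect)
 {
     "Connect error: $tempRef"
 }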

Now that I have a return of true or false I can add the test value and datasource to my $validobject variable:

 $validObject | Add-Member -type NoteProperty -Name 'Valid' -Value $validConnect
 $validObject | Add-Member -Type NoteProperty -Name 'DataSource' -Value $datasource
 [pscustomobject]$validObject

Now that that's done, I can show you the entire function with all the items added above, plus another function that does a little housekeeping on the passed-in data source. I have more of this set of functions in a gist called SQLReporting.ps1 in my repo: https://gist.github.com/crshnbrn66/

 

<#
 .SYNOPSIS
 Test to see if the Datasource passed is valid
 
 .DESCRIPTION
 Checks to see if the data source exists. Then will test the data source to see if it is operational
 
 .PARAMETER datasource
 This is the full path to the datasource, where in the examples datasource1 is our data source name.
 For example: /myapp/mydatasource/data sources/datasource1
 Example 2: /datasource1
 
 .PARAMETER reportWebService
 A description of the reportWebService parameter.
 
 .PARAMETER Credentials
 You must pass a pscredential object
 
 .PARAMETER NoTest
 if this is passed a test will not be performed on the data source.
 
 .NOTES
 Additional information about the function.
#>
function Test-ValidDataSource
{
 [CmdletBinding()]
 param
 (
 [Parameter(Mandatory = $true,
 ValueFromPipeline = $true,
 ValueFromPipelineByPropertyName = $true)]
 [string]$datasource,
 [Parameter(Mandatory = $true,
 ValueFromPipeline = $true,
 ValueFromPipelineByPropertyName = $true,
 HelpMessage = 'Provide the full path to the DataSource')]
 [uri]$reportWebService,
 [Parameter(Mandatory = $true,
 ValueFromPipeline = $true,
 ValueFromPipelineByPropertyName = $true,
 HelpMessage = 'You must pass a pscredential object')]
 [System.Management.Automation.PSCredential]$Credentials,
 [switch]$NoTest
 )
 
 $datasource = Normalize-SSRSFolder $datasource
 try
 {
 $reportProxy = new-webserviceproxy -uri $reportWebService -Credential $credentials
 $reportType = $reportProxy.getitemtype($datasource)
 Write-Debug $reportType
 if ($reportType -eq 'DataSource')
 {
 try
 {
 $validObject = $reportProxy.Getdatasourcecontents($datasource)
 if ($validObject.gettype().name -eq 'DataSourceDefinitionOrReference' -or $validObject.gettype().name -eq 'DataSourceDefinition') #compare both type names; a bare string on the right of -or is always $true
 {
 
 if ($NoTest)
 {
 $validConnect = $false
 }
 else
 {
 $tempRef = $true # have to initialize a variable so it can be used as a reference in the next method call
 $validConnect = $reportproxy.TestConnectForItemDataSource($datasource, $datasource, ($validObject.username), ($validObject.password), ([ref]$tempRef))
 }
 $validObject | Add-Member -type NoteProperty -Name 'Valid' -Value $validConnect
 $validObject | Add-Member -Type NoteProperty -Name 'DataSource' -Value $datasource
 [pscustomobject]$validObject
 }
 else
 {
 $invalid = "invalid object or permission"
 [pscustomobject]@{
 'Extension' = $invalid
 'ConnectString' = $invalid
 'UseOriginalConnectString' = $false
 'OriginalConnectStringExpressionBased' = $false
 'CredentialRetrieval' = $invalid
 'ImpersonateUserSpecified' = $false
 'WindowsCredentials' = $false
 'Prompt' = $invalid
 'UserName' = $invalid
 'Password' = $invalid
 'Enabled' = $false
 'EnabledSpecified' = $false
 'Valid' = $false
 }
 }
 }
 catch
 {
 "Error was $_"
 $line = $_.InvocationInfo.ScriptLineNumber
 "Error was in Line $line"
 }
 
 
 }
 }
 catch
 {
 "Error was $_"
 $line = $_.InvocationInfo.ScriptLineNumber
 "Error was in Line $line"
 }
}

function Normalize-SSRSFolder
{
 param
 (
 [string]$Folder
 )
 
 if (-not $Folder.StartsWith('/'))
 {
 $Folder = '/' + $Folder
 }
 elseif ($Folder -match '//')
 {
 $Folder = $Folder.replace('//','/')
 }
 
 return $Folder
}
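For reference, here is what the helper does with typical input:

 Normalize-SSRSFolder 'ThomTest'   # returns /ThomTest
 Normalize-SSRSFolder '//ThomTest' # returns /ThomTest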

 

I hope this helps someone

Until then keep scripting

 

Thom


Capture Web Page / HTML to JPG

I'm a member of an American Legion post, and as such I've been working with them on displaying images on screens for schedules and such. For a while I've been using various programs to capture an image from a website and save it to a JPG file. That got me to thinking: there has to be a way to do this in a script. This article is about how I did just that.

First, I needed an easy way to bring a web page's HTML into memory for conversion to a JPG. After much searching I found a handy library called NReco. Now that I have a DLL I can import, I can add it to my PowerShell script with Add-Type:

Add-Type -Path ".\nreco\NReco.ImageGenerator.dll"

For simplicity I chose to put the DLL in a subfolder where my script resides. With the DLL imported, on to seeing what it can do for me. According to the article, the DLL will convert HTML to a JPG in one line of code. So I chose to take advantage of Invoke-WebRequest, pointing it at powershell.org to see if it would save the page.

$html = (invoke-webrequest -uri 'https://powershell.org/forums/').Content # .Content gives the raw HTML string the converter expects
$h2image = new-object NReco.ImageGenerator.HtmlToImageConverter
$imageFormat = [NReco.ImageGenerator.ImageFormat]::Jpeg
$jpeg = $h2image.generateImage($html, $imageformat)
$dataStream = New-Object System.IO.MemoryStream(,$jpeg)
$img = [System.Drawing.Image]::FromStream($dataStream)
$img.save('c:\temp\image.jpg')

So $h2image is an object from the DLL we pulled in, which allows us to convert the web page to a JPG. Depending on the size of the page, it may take a little while for the conversion to return.

$h2image = new-object NReco.ImageGenerator.HtmlToImageConverter

The next line of code, the image format, tells the DLL what type of file we want to save to. Through IntelliSense in the ISE you'll notice there are three types included in this enumeration.

For what I needed I chose JPG.
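If you don't have IntelliSense handy, you can also list the enumeration's members from the shell once the NReco dll is loaded:

 [Enum]::GetNames([NReco.ImageGenerator.ImageFormat])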

Now that I have the file type chosen and the conversion done, I can load the resulting bytes into a memory stream:

$dataStream = New-Object System.IO.MemoryStream(,$jpeg)

This one took me a while to figure out; if it hadn't been for this article I might never have gotten it: http://piers7.blogspot.com/2010/03/3-powershell-array-gotchas.html

The solution for getting the array into a stream is in this tidbit:

Cup(Of T): 3 PowerShell Array Gotchas

The (somewhat counter-intuitive) solution here is to wrap the array – in an array. This is easily done using the ‘comma’ syntax (a comma before any variable creates a length-1 array containing the variable):

PS > $bytes = 0x1,0x2,0x3,0x4,0x5,0x6
PS > $stream = new-object System.IO.MemoryStream (,$bytes)
PS > $stream.length
6

Now that I have the image bytes in a stream, I can write them to a file using another .NET class, System.Drawing.Image:

$img = [System.Drawing.Image]::FromStream($dataStream)
$img.save('c:\temp\image.jpg')

And voilà, my web page is saved as a JPG.


Full script:

Add-Type -Path ".\nreco\NReco.ImageGenerator.dll"
Add-Type -AssemblyName System.Drawing # load System.Drawing for [System.Drawing.Image]
$html = (invoke-webrequest -uri 'https://powershell.org/forums/').Content # raw HTML string
$h2image = new-object NReco.ImageGenerator.HtmlToImageConverter
$imageFormat = [NReco.ImageGenerator.ImageFormat]::Jpeg
$jpeg = $h2image.generateImage($html, $imageformat)
$dataStream = New-Object System.IO.MemoryStream(,$jpeg)
$img = [System.Drawing.Image]::FromStream($dataStream)
$img.save('c:\temp\image.jpg')


I hope this helps someone ..

Until then keep scripting

Thom

 


Comparing IIS Sites

Recently I ran into an issue where I needed to compare a site on one server to the same application on a server in another environment. I decided to find a way to detect where the configuration lives and then automatically launch a compare tool; I chose WinMerge. This article is about the scripting I wrote to get to the end goal of comparing one site to another.

So I started with a function that resolves environment variables; most of the function I got from this post: https://powershell.org/friday-fun-expand-environmental-variables-in-powershell-strings/. I expanded on it to allow a computerName to be specified, and to support that I had to write a function to test whether the computer is the local machine or not. So I added a function called Get-LocalRemoteComputer.

   <#
 .SYNOPSIS
 Determines if the name passed is the localhost or not
 
 .DESCRIPTION
 If the name passed is the localhost then the script will send back
 the computername: 
 
 .example
 get-localremotecomputer -computername .
 yourmachinename
 get-localremotecomputer -computername 127.0.0.1
 yourmachinename
 get-localremotecomputer -computername servername
 servername
 
 .PARAMETER computername
 A description of the computername parameter.
 
 .NOTES
 Additional information about the function.
#>
function Get-LocalRemoteComputer
{
 param
 (
 [string]$computername
 )
 
 if ($computername -eq '.' -or ($env:COMPUTERNAME -eq $computername)`
 -or ($computername -eq 'Localhost') -or ($computername -eq '127.0.0.1'))
 {
 $computername = $env:COMPUTERNAME
 $computername
 }
 else
 { $computername }
}

Now to explain what this function does.

I simply take the computer name passed in and compare it to the environment variable $env:COMPUTERNAME, to 127.0.0.1, and to localhost. If it is the current computer you are running on, it returns the local computer name.

Otherwise it returns the computer name that was passed.

Now that I have the computer name I'm going to operate on, I can get remote environment variables from the machine by issuing an Invoke-Command with the computer name, as shown below:

(invoke-command -ComputerName $computername -ScriptBlock `
 { param ([string]$t)get-item env:$t -ErrorAction SilentlyContinue } `
-ArgumentList $text).value

The modified function (allowing computernames)  from this post https://powershell.org/friday-fun-expand-environmental-variables-in-powershell-strings/  is shown below:

function Resolve-EnvVariable
{
 [CmdletBinding()]
 param
 (
 [Parameter(Mandatory = $true,
 ValueFromPipeline = $true,
 Position = 0,
 HelpMessage = 'Enter a string that contains an environmental variable like %WINDIR%')]
 [ValidateNotNullOrEmpty()]
 [string]$String,
 [string]$computername
 )
 #https://powershell.org/friday-fun-expand-environmental-variables-in-powershell-strings/
 Begin
 {
 $computerName = Get-LocalRemoteComputer -computername $computerName
 Write-Verbose "Starting $($myinvocation.mycommand)"
 
 } #Begin
 
 Process
 {
 #if string contains a % then process it
 if ($string -match "%\S+%")
 {
 Write-Verbose "Resolving environmental variables in $String"
 #split string into an array of values
 $values = $string.split("%") | Where { $_ }
 foreach ($text in $values)
 {
 #find the corresponding value in ENV:
 Write-Verbose "Looking for $text"
 if ($env:COMPUTERNAME -ne $computername)
 {
 [string]$replace = (invoke-command -ComputerName $computername -ScriptBlock `
{ param ([string]$t)get-item env:$t -ErrorAction SilentlyContinue }`
 -ArgumentList $text).value
 #(Get-Item env:$text -erroraction "SilentlyContinue").Value
 if ($replace)
 {
 #if found append it to the new string
 Write-Verbose "Found $replace"
 $newstring += $replace
 }
 else
 {
 #otherwise append the original text
 $newstring += $text
 }
 $newstring -match '\w:' | out-null
 if ($Matches)
 {
 $driveLetter = ($Matches.values).trim(':')
 $newstring = $newstring -replace '\w:', "\\$computername\$driveletter$"
 }
 }
 else
 {
 [string]$replace = (Get-Item env:$text -erroraction "SilentlyContinue").Value
 if ($replace)
 {
 #if found append it to the new string
 Write-Verbose "Found $replace"
 $newstring += $replace
 }
 else
 {
 #otherwise append the original text
 $newstring += $text
 }
 }
 } #foreach value
 
 Write-Verbose "Writing revised string to the pipeline"
 #write the string back to the pipeline
 Write-Output $NewString
 } #if
 else
 {
 #skip the string and write it back to the pipeline
 Write-Output $String
 }
 } #Process
 
 End
 {
 Write-Verbose "Ending $($myinvocation.mycommand)"
 } #End
} #end Resolve-EnvVariable
 

When you call this function, the Begin block calls the function described above and gets the name of the machine for use in the rest of the function:

 Begin
 {
 $computerName = Get-LocalRemoteComputer -computername $computerName
 Write-Verbose "Starting $($myinvocation.mycommand)"
 
 } #Begin

I needed this function to be able to parse the XML from IIS: the IIS schema stores the locations of the config files in %variable% fashion.
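For example, expanding an IIS-style path for a hypothetical remote machine named web01 would look like this; the UNC rewrite happens in the remote branch of the function:

 Resolve-EnvVariable -String '%SystemRoot%\System32\inetsrv' -computername 'web01'
 # returns \\web01\C$\Windows\System32\inetsrv (assuming SystemRoot is C:\Windows on web01)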

Now on to the next function, which retrieves the current configuration location of the installed IIS instance. Locally it uses $env:windir, so the rare install that isn't on C: is handled; for remote machines it assumes the administrative C$ share.

<#
 .SYNOPSIS
 Gets the current directory for IIS for the machine that is passed.
 
 .DESCRIPTION
 gets the current installed location of iis from the IIS 
 
 .PARAMETER computername
 string for computername that we want to find the current installation of iis
 
 .EXAMPLE
 Get-CurrentIISConfiguration -computername test
 returns: \\test\c$\windows\system32\inetsrv\config
#>
function Get-CurrentIIsConfiguration
{
 [OutputType([string])]
 param
 (
 [string]$computername
 )
 
 $computerName = Get-LocalRemoteComputer -computername $computerName
 if ($computerName -ne $env:COMPUTERNAME)
 {
 $startingdir = "\\$computername\c$\windows"
 }
 else
 {
 $startingdir = $env:windir
 }
 $mostRecentDir = "$startingdir\system32\inetsrv\config"
 $mostRecentDir
}

Now that I have the current installation of IIS, I can find the location of the most recent IIS backup. I did this so that if I need to compare an applicationHost.config from a backup to the most recent version, I can do so. This function is called Get-LastIISBackupLocation.

<#
 .SYNOPSIS
 Gets the current backup location of IIS Configs
 
 .DESCRIPTION
 Gets the latest directory that contains the last IIS backup
 If you do not specify an index number then the function will get index 0.
 Each backup in IIS is a separate directory; the script sorts them by last
 write time so that index 0 is the most recent backup.
 
 .PARAMETER computername
 This is a string value that represents the computer name that we want to
 find the backups for.
 
 .PARAMETER index
 This is an int16 value that represents the index number of the backup
 you want to retrieve. The most recent backup is index 0.
 
 .NOTES
 This returns a string object of the backup location for the computername passed.
 .Example
 PS PS:\> Get-LastIISBackupLocation -computername test 1
 returns: \\test\C$\Windows\system32\inetsrv\backup\2016-07-14
 .Example 2
 PS PS:\> Get-LastIISBackupLocation -computername test 0
 \\test\C$\Windows\system32\inetsrv\backup\2016-07-15
#>
function Get-LastIISBackupLocation
{
 [OutputType([string])]
 param
 (
 [string]$computername,
 [int16]$index = '0'
 )
 
 $computerName = Get-LocalRemoteComputer -computername $computerName
 if ($computerName -ne $env:COMPUTERNAME)
 {
 $startingdir = Resolve-EnvVariable -String '%systemroot%' -computername $computername
 }
 else
 {
 $startingdir = $env:windir
 }
 $mostRecentDir = dir ("$startingdir\system32\inetsrv\backup") | Sort-Object -Property Lastwritetime -Descending | select -Index $index
 $mostRecentDir.fullname
}

Now that I have the last backup location, I also wanted to be able to get the last incremental backup location. Every time an administrator makes a configuration change through IIS, IIS records it and puts a copy in the configuration history folder; on most machines that's a directory like c:\inetpub\history\CFGHISTORY_XXXXX.

<#
 .SYNOPSIS
 Provides a means to get the current backup location for iis changes
 
 .DESCRIPTION
 Changes made to the IIS instance are recorded in a history config file. This function provides the means to retrieve where it is on the machine.
 
 Get-IISSystemHistoryLocation -index '2' 
 
 .PARAMETER index
 If a value for index is passed, it'll get the nth entry from the collection of
 history backups, sorted newest first. For instance, if there are 5 backups and
 the index passed is 2, it'll get the third most recent one.
 
 .NOTES
 Additional information about the function.
#>
function Get-IISSystemHistoryLocation
{
 [OutputType([string])]
 param
 (
 [string]$computerName,
 [Parameter(Mandatory = $false)]
 [int16]$index = '1'
 )
 $computerName = Get-LocalRemoteComputer -computername $computerName
 if($computerName -ne $env:COMPUTERNAME)
 {
 $startingdir = Resolve-EnvVariable -String '%systemroot%' -computername $computername
 }
 else
 {
 $startingdir = $env:windir
 }
 $config = [xml](get-content `
 "$startingDir\system32\inetsrv\config\schema\IIS_schema.xml")
 $configHistory = (($config.configSchema.sectionschema`
 | where{ $_.name -like "*configHistory*" }).attribute | ?{ $_.name -like "path" }).defaultvalue
 $envVar = $configHistory | Resolve-EnvVariable -computername $computername
 $configDir = dir $envVar | Sort-Object -Property lastwritetime -Descending | select -Index $index
 $configd = $configdir.fullname
 $configd -match '\w:' | out-null
 if($Matches)
 {
 $driveLetter = ($Matches.values).trim(':') 
 $serverConfig = $configd -replace '\w:', "\\$computername\$driveletter$"
 $configd = $serverConfig
 }
 $configd
}

Now that I have the functions for getting the current installation, the current backup, and the history, I can pass this information on to WinMerge so I can see the comparison locally on my desktop. To accomplish this I wrote a function specifically for WinMerge that accepts a source and a difference file and opens WinMerge.

<#
 .SYNOPSIS
 Takes the two files passed and sends them to WinMerge for comparison.
 
 .DESCRIPTION
 This function will pass the source and difference files to WinMerge and then launch it.
 
 .PARAMETER sourceFile
 file used to be the source of the comparison
 
 .PARAMETER diffFile
 file that is the difference file
 
 .PARAMETER winMergeLoc
 Physical location of the WinMerge exe.
 
 .NOTES
 Additional information about the function.
#>
function Compare-IISConfigsWinMerge
{
 param
 (
 [string]$sourceFile,
 [string]$diffFile,
 [string]$winMergeLoc
 )
 
 if (test-path $winMergeLoc)
 {
 if (test-path $sourcefile)
 {
 if (test-path $difffile)
 {
 & "$winmergeLoc " "$sourcefile " "$diffFile"
 $results = $true
 }
 else { $results = 'bad diff file' }
 }
 else {$results = 'bad source file'}
 }
 else {$results = 'bad winmerge location' }
 $results
}
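Tying the pieces together, a comparison between two servers might look like this sketch (hypothetical server names and a typical WinMerge install path):

 $source = Join-Path (Get-CurrentIIsConfiguration -computername 'web01') 'applicationHost.config'
 $diff = Join-Path (Get-CurrentIIsConfiguration -computername 'web02') 'applicationHost.config'
 Compare-IISConfigsWinMerge -sourceFile $source -diffFile $diff `
     -winMergeLoc 'C:\Program Files (x86)\WinMerge\WinMergeU.exe'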

The full script can be found on my github account in this gist.

Below is a screenshot of this comparison:

[Screenshot: WinMerge comparing the two applicationHost.config files]

I hope this helps someone ..

Until then keep scripting

Thom

 

 

Cloning VMs in Hyper-V

Nice article on FoxDeploy about cloning VMs in Hyper-V:

It’s a common enough scenario: build one template machine and then mirror that to make a testlab, so you’d think this would be a built-in feature of Hyper-V, but its not. Luckily, it’s not too hard to do once you know how, and I’ll guide you through the tough parts Overall process We’ll be following […]

via Cloning VMs in Hyper-V — FoxDeploy.com


Updating Azure modules – In Azure Automation

Recently I wrote an article about Azure Alert Aggregation. In that article I talked about the need to keep your modules up to date. Since then I spoke to @joDogLevy at Microsoft, who suggested that I submit a question on their forums:

https://social.msdn.microsoft.com/Forums/azure/en-US/eb95fcc1-d94f-42b1-a8d5-274d287bb2cd/updating-modules-in-azure?forum=azureautomation

He then provided the script to allow for updating modules in your Azure automation account as they become available.

I've since updated the script to allow for a subscription name and a credential from Azure Active Directory. Here is the script I came up with:

param(
    [Parameter(Mandatory=$true)]
    [String] $ResourceGroupName,

    [Parameter(Mandatory=$true)]
    [String] $AutomationAccountName,

    [Parameter(Mandatory=$false)]
    [String] $AzCredential = 'MyAccount',

    [Parameter(Mandatory=$false)]
    [string] $subscriptionName = 'Azure Testing'

)

try {
 
    'Logging in to Azure...'
    $cred = Get-AutomationPSCredential -Name $AzCredential
    Add-AzureRmAccount -Credential $cred
}
catch {
    if(!$cred) {
        throw "Connection $AzCredential not found."
    }
    else {
        throw $_.Exception
    }
}
set-azurermContext -SubscriptionName $subscriptionName

$Modules = Get-AzureRmAutomationModule `
    -ResourceGroupName $ResourceGroupName `
    -AutomationAccountName $AutomationAccountName

$AzureRMProfileModule = Get-AzureRmAutomationModule `
    -ResourceGroupName $ResourceGroupName `
    -AutomationAccountName $AutomationAccountName `
    -Name 'AzureRM.Profile'

# Force AzureRM.Profile to be evaluated first since some other modules depend on it 
# being there / up to date to import successfully
$Modules = @($AzureRMProfileModule) + $Modules

foreach($Module in $Modules) {

    # refresh the module object from the account (assigning to $Modules here as well was a typo that clobbered the collection)
    $Module = Get-AzureRmAutomationModule `
        -ResourceGroupName $ResourceGroupName `
        -AutomationAccountName $AutomationAccountName `
        -Name $Module.Name
    
    $ModuleName = $Module.Name
    $ModuleVersionInAutomation = $Module.Version

    Write-Output "Checking if module '$ModuleName' is up to date in your automation account"

    $Url = "https://www.powershellgallery.com/api/v2/Search()?`$filter=IsLatestVersion&searchTerm=%27$ModuleName%27&targetFramework=%27%27&includePrerelease=false&`$skip=0&`$top=40" 
    Write-output $Url
    $SearchResult = Invoke-RestMethod -Method Get -Uri $Url -UseBasicParsing

    if(!$SearchResult) {
        Write-Error "Could not find module '$ModuleName' in PowerShell Gallery."
    }
    elseif($SearchResult.Length -and $SearchResult.Length -gt 1) {
        Write-Error "Module name '$ModuleName' returned multiple results. Please specify an exact module name."
    }
    else {
        $PackageDetails = Invoke-RestMethod -Method Get -UseBasicParsing -Uri $SearchResult.id 
        $LatestModuleVersionOnPSGallery = $PackageDetails.entry.properties.version

        if($ModuleVersionInAutomation -ne $LatestModuleVersionOnPSGallery) {
            Write-Output "Module '$ModuleName' is not up to date. Latest version on PS Gallery is '$LatestModuleVersionOnPSGallery' but this automation account has version '$ModuleVersionInAutomation'"
            Write-Output "Importing latest version of '$ModuleName' into your automation account"

            # use the version looked up above; the original referenced an undefined $ModuleVersion, which silently resolved to the latest package
            $ModuleContentUrl = "https://www.powershellgallery.com/api/v2/package/$ModuleName/$LatestModuleVersionOnPSGallery"

            # Find the actual blob storage location of the module
            do {
                $ActualUrl = $ModuleContentUrl
                $ModuleContentUrl = (Invoke-WebRequest -Uri $ModuleContentUrl -MaximumRedirection 0 -UseBasicParsing -ErrorAction Ignore).Headers.Location 
            } while($ModuleContentUrl -ne $Null)

            $Module = New-AzureRmAutomationModule `
                -ResourceGroupName $ResourceGroupName `
                -AutomationAccountName $AutomationAccountName `
                -Name $ModuleName `
                -ContentLink $ActualUrl
                
            while($Module.ProvisioningState -ne 'Succeeded' -and $Module.ProvisioningState -ne 'Failed') {
                Start-Sleep -Seconds 10
            
                $Module = Get-AzureRmAutomationModule `
                    -ResourceGroupName $ResourceGroupName `
                    -AutomationAccountName $AutomationAccountName `
                    -Name $ModuleName

                Write-Output 'Polling for import completion...'
            }

            if($Module.ProvisioningState -eq 'Succeeded') {
                Write-Output "Successfully imported latest version of $ModuleName"
            }
            else {
                Write-Error "Failed to import latest version of $ModuleName"
            }   
        }
        else {
            Write-Output "Module '$ModuleName' is up to date."
        }
   }
}
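Since Get-AutomationPSCredential only resolves inside Azure Automation, this script is meant to run as a runbook. Once imported (under a hypothetical name like Update-AzureModules), you could kick it off like this sketch (resource names are placeholders):

$params = @{
    ResourceGroupName     = 'MyResourceGroup'
    AutomationAccountName = 'MyAutomationAccount'
    AzCredential          = 'MyAccount'
    subscriptionName      = 'Azure Testing'
}
Start-AzureRmAutomationRunbook -Name 'Update-AzureModules' -Parameters $params `
    -ResourceGroupName 'MyResourceGroup' -AutomationAccountName 'MyAutomationAccount'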

I hope this helps someone ..

Until then keep scripting

Thom


Azure Alert Aggregator

Well it’s been a while since I last wrote up what I’ve been up to in the world of Scripting.

Note: there are two full scripts in this post

Disclaimer: This script works for me in its entirety and I’m getting ready to productionalize it.  My plan is to have it in a Module eventually.  For now the script and logic is what I’m presenting in this post.

I was brought a problem where alerts in Azure fire too frequently, and was asked if I could create some logic to detect when an alert fires (via webhook) and then evaluate that alert (aggregate the alert data). The animation below illustrates the PowerShell scripts involved.

[Animation: the alert webhook firing into the runbooks and aggregating the alert data]

So I started this venture by exploring how a webhook works. This video gave a lot of insight into how webhooks work in Azure; thanks @pcGeek86. With this information I was able to take one of my alerts and discover how to get it to fire to my Azure Automation:

[Screenshot: the alert's Edit Rule blade in the Azure portal]

In the webhook area I put the address of the webhook that I created for the runbook in my automation account.

[Screenshot: the webhook settings blade in the Azure portal]

One thing to note about webhooks: I worked several times on getting the GUI to create the webhook, only to find that it was never saved, so I had to use PowerShell to create the webhook for my runbook. Your results may vary.

I took the example from –> help New-AzureRmAutomationWebhook -Examples

Modified it slightly to expire in 2 years ((get-Date).addyears(2)) .

$Webhook = New-AzureRmAutomationWebhook -Name "AlertHook" -IsEnabled $True `
    -ExpiryTime ((Get-Date).AddYears(2)) -RunbookName "ContosoRunbook" `
    -ResourceGroup "ResourceGroup01" -AutomationAccountName "AutomationAccount01" -Force

I didn’t need parameters since the only thing I really need is the webhook object. This is something that the alerts take care of for you.

Now that I have my webhook created I can now write some code to consume the webhook. So I began with this snippet of code:

param ( 
        [object]$WebhookData
    )

    # If runbook was called from Webhook, WebhookData will not be null.
    if ($WebhookData -ne $null) {   

        # Collect properties of WebhookData
        $WebhookName    =   $WebhookData.WebhookName
        $WebhookHeaders =   $WebhookData.RequestHeader
        $WebhookBody    =   $WebhookData.RequestBody
        Write-Output $WebhookData

The first item I look at in the webhook data is the value for Status. If the status is Resolved then I don't need to take any further action; if it's some other status then I drop into the next set of logic:

 If($WebhookBody.Status -ne 'Resolved')    
{}
 else
 {
 write-output 'State has been resolved'
 }

Now that I have the webhook, I need the rest of the data from it for processing. I can get it by converting the JSON data into a PowerShell object:

$WebhookBody = (ConvertFrom-Json -InputObject $WebhookBody)

Now that I know what's in my webhook body, I can write out the values of each item in the object. I'll refer to the data from this JSON parse as AlertContext or AlertPayload (interchangeably):

 write-output "`nALERT CONTEXT DATA"
 write-output '==================='
 write-output "Name:`t`t $($AlertContext.name)"
 write-output "Subscriptionid:`t`t $($AlertContext.subscriptionId)"
 write-output "MetricName:`t`t $($AlertContext.condition.MetricName)"
 write-output "MetricValue:`t`t $($AlertContext.condition.metricvalue)"
 write-output "Threshold:`t`t $($AlertContext.condition.Threshold)"
 write-output "ResourceGroupName:`t`t $($AlertContext.resourceGroupName)"
 write-output "ResourceName:`t`t $($AlertContext.resourceName)"
 write-output "ResourceType:`t`t $($AlertContext.resourceType)"
 write-output "ResourceID:`t`t $($AlertContext.resourceId)"
 write-output "Timestamp:`t`t $($AlertContext.timestamp)"

Now that I have that detail, on to the harder part; this is where I spent five days of this whole process. The thing I learned is that if you receive the error:

Run Login-AzureRmAccount to login

The first thing you should do is check the versions of the Azure modules in your automation account and ensure they are up to date.
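A quick way to eyeball what's in the account (a sketch, using the resource names from later in this post):

Get-AzureRmAutomationModule -ResourceGroupName 'AzureTesting' `
    -AutomationAccountName 'AutomationAccountTest' |
    Select-Object Name, Version, ProvisioningState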

Now that our Azure modules are up to date, we can begin the process of logging in with an AD account. You could use a service principal, but that takes quite a bit more setup and is outside the scope of this blog article.

To do this you'll need to create a credential asset in your automation account, which is demonstrated here. Then we'll use the credential asset in our automation with Get-AutomationPSCredential. Once we have the credential in our runbook, we use Login-AzureRmAccount, which is an alias for Add-AzureRmAccount.

$cred = Get-AutomationPSCredential -Name 'ThomSchumacher'
 write-output 'login to azure with automation account'
 Add-AzureRmAccount -Credential $cred
Set-AzureRmContext -subscriptionName 'Azure Testing' 

If you have more than one subscription, you'll need to make certain you change to the same subscription your Azure Automation account is running in. In the alert payload we get the following items: subscriptionId, PortalLink, AlertName, ResourceType, ResourceId, alert Threshold, alert WindowSize, and alert MetricName.

Since I know I need each one of these for scheduling the Azure automation, I'm going to create variables that contain each of them:

 $AlertContext = [object]$WebhookBody.context
 $credential ='ThomSchumacher'
 $SubscriptionId = $AlertContext.subscriptionId
 $portalLink = $AlertContext.portalLink
 $AlertName = $AlertContext.name
 $ResourceType = $AlertContext.resourceType
 $ResourceId = $AlertContext.resourceId
 $AlertMetric= $AlertContext.condition.Threshold
 $WindowSize= ($AlertContext.condition.windowsize) /2 +1
 $AlertDateTime = get-date
 $AlertMetricName = $AlertContext.condition.MetricName
 $counterType = ($AlertContext.condition.MetricName) -replace (' ','')
 $subscriptionName = 'Azure Testing' 
 $resourceId = ($AlertContext.resourceId) 
 $minutes = '1'
 $jobName = "CaptureAlerts-$counterType"
 $resoureGroupName = 'AzureTesting'
 $AutomationAccountName = 'AutomationAccountTest'
 $runbookName = 'check4Alerts'
 $description = "$counterType`: checkforAlerts job"

 $ht = @{}; Get-Variable -Name ('Subscriptionid','Portallink','alertname',`
'resourcetype','resourceid','alertmetric','alertdatetime','countertype') `
| foreach { $ht.Add($_.Name,$_.Value)}

Then the $ht (hashtable) object built above will be passed on to the runbook that gets scheduled.

 

Now for the rest of the values passed; the one that matters most is the alert WindowSize. The window size is the period of time over which you tell Azure to evaluate the value being watched.


The values are 5, 10, 15, 30, 45, 1hour, 2hours, 3hours, 4hours, 5hours and 6hours.

For my purposes I've decided to stick to 45 minutes or less, so I check that value to make certain I'm not over it; if I'm over the 45-minute mark I don't fire my automation. One other thing I've found is that if you try to create a schedule for your runbook that is less than 5 minutes out, Azure will not allow you to schedule it.

So I have two situations I must account for.

  1. Any windowsize that is 5 minutes: we must run for at least 3 minutes (half the sample size plus one minute). In this situation we cannot set up a schedule for the runbook.
  2. If the windowsize is greater than 45 minutes, write a message that the windowsize is greater than the accepted values.

The check that separates these cases is:

 if(($WindowSize -ge 6) -and ($WindowSize -le 45))

Now that the situations are accounted for, if the value is over 5 minutes I can create a schedule for my child runbook, which will actually check the alert values. The first thing I check is whether the job name is already defined:

Get-AzureRmAutomationSchedule -name $jobname -ResourceGroupName `
$resoureGroupName -AutomationAccountName $AutomationAccountName `
-Verbose -ErrorAction SilentlyContinue

If there is, I must delete it, as I have no way of rescheduling the job; there is no cmdlet for that at this time.

Remove-AzureRmAutomationSchedule -Name $jobName -ResourceGroupName`
 $resoureGroupName -AutomationAccountName $AutomationAccountName -Force

Now I can create a schedule and then associate (register) it with my runbook. When I create the schedule I add the windowsize to the time I want the schedule to run; if you look at the variable earlier, it's set to windowsize / 2 + 1.

 write-output "create a new adoc schedule $jobName"
 write-output "new-azurermAutomationschedule -Name $jobName -ResourceGroupName $resoureGroupName -AutomationAccountName $AutomationAccountName -StartTime ((get-date).AddMinutes($windowSize)) -OneTime -Description $description "
 new-azurermAutomationschedule -Name $jobName -ResourceGroupName $resoureGroupName `
 -AutomationAccountName $AutomationAccountName `
-StartTime ((get-date).AddMinutes($windowSize)) -OneTime -Description $description 
 write-output "Adding the job schedule to the runbook $runBookName"
 Write-Output "register-AzureRmAutomationScheduledRunbook -RunbookName $runbookName -ScheduleName $jobName -Parameters $ht -ResourceGroupName $resoureGroupName -AutomationAccountName $AutomationAccountName"
 $newSchedule = register-AzureRmAutomationScheduledRunbook `
 -RunbookName $runbookName -ScheduleName $jobName `
-Parameters $ht -ResourceGroupName $resoureGroupName `
 -AutomationAccountName $AutomationAccountName -Verbose

I've chosen to call this script CaptureAlerts.ps1; here is the entire script:

#requires -Version 3 -Modules Azure, AzureAutomationAuthoringToolkit, AzureRM.Insights, AzureRM.Profile,AzureRm.Automation
param( [object]$WebhookData)

if ($WebhookData -ne $null)
 { 
 # Collect properties of WebhookData.
 $WebhookName = $WebhookData.WebhookName
 $WebhookBody = $WebhookData.RequestBody
 $WebhookHeaders = $WebhookData.RequestHeader
 write-output $WebhookData 
 # Information on the webhook name that called This
 write-output "This runbook was started from webhook $WebhookName."
 # Obtain the WebhookBody containing the AlertContext
 $WebhookBody = (ConvertFrom-Json -InputObject $WebhookBody) 
 write-output "`nWEBHOOK BODY"
 write-output '============='
 write-output "Status:`t`t $($WebhookBody.status)"
 If($WebhookBody.Status -ne 'Resolved') 
 {
 $AlertContext = [object]$WebhookBody.context
 $credential ='ThomSchumacher'
 $SubscriptionId = $AlertContext.subscriptionId
 $portalLink = $AlertContext.portalLink
 $AlertName = $AlertContext.name
 $ResourceType = $AlertContext.resourceType
 $ResourceId = $AlertContext.resourceId
 $AlertMetric= $AlertContext.condition.Threshold
 $WindowSize= ($AlertContext.condition.windowsize) /2 +1
 $AlertDateTime = get-date
 $AlertMetricName = $AlertContext.condition.MetricName
 $counterType = ($AlertContext.condition.MetricName) -replace (' ','')
 $subscriptionName = 'Azure Testing' 
 $resourceId = ($AlertContext.resourceId) 
 $minutes = '1'
 $jobName = "CaptureAlerts-$counterType"
 $resoureGroupName = 'AzureTesting'
 $AutomationAccountName = 'AutomationAccountTest'
 $runbookName = 'check4Alerts'
 $description = "$counterType`: checkforAlerts job"

 $ht = @{}; Get-Variable -Name ('Subscriptionid','Portallink','alertname','resourcetype','resourceid','alertmetric','alertdatetime','countertype') | foreach { $ht.Add($_.Name,$_.Value)}
 write-output "Check for the existence of the adhoc schedule for CaputureAlerts Runbook -- $jobname"
 write-output 'login to azure with automation account'

 #$Conn = Get-AutomationConnection -Name AzureRunAsConnection 
 #Add-AzureRMAccount -ServicePrincipal -Tenant $Conn.TenantID -ApplicationId $Conn.ApplicationID -CertificateThumbprint $Conn.CertificateThumbprint
 $cred = Get-AutomationPSCredential -Name 'ThomSchumacher'
 write-output 'login to azure with automation account'
 Add-AzureRmAccount -Credential $cred
 Set-AzureRmContext -subscriptionName 'Azure Testing'
 Write-output "Window Size must be between 10 and 45 minutes for this automation to work"
 if(($WindowSize -ge 6) -and ($WindowSize -le 45))
 {
 Write-Output "Get-AzureRmAutomationSchedule -name $jobname -ResourceGroupName $resoureGroupName -AutomationAccountName $AutomationAccountName -Verbose -ErrorAction SilentlyContinue"
 if(Get-AzureRmAutomationSchedule -name $jobname -ResourceGroupName $resoureGroupName -AutomationAccountName $AutomationAccountName -Verbose -ErrorAction SilentlyContinue)
 {
 Write-Output "Remove-AzureRmAutomationSchedule -Name $jobName -ResourceGroupName $resoureGroupName -AutomationAccountName $AutomationAccountName -Force"
 Remove-AzureRmAutomationSchedule -Name $jobName -ResourceGroupName $resoureGroupName -AutomationAccountName $AutomationAccountName -Force
 }
 write-output "create a new adoc schedule $jobName"
 write-output "new-azurermAutomationschedule -Name $jobName -ResourceGroupName $resoureGroupName -AutomationAccountName $AutomationAccountName -StartTime ((get-date).AddMinutes($windowSize)) -OneTime -Description $description "
 new-azurermAutomationschedule -Name $jobName -ResourceGroupName $resoureGroupName -AutomationAccountName $AutomationAccountName -StartTime ((get-date).AddMinutes($windowSize)) -OneTime -Description $description 
 write-output "Adding the job schedule to the runbook $runBookName"
 Write-Output "register-AzureRmAutomationScheduledRunbook -RunbookName $runbookName -ScheduleName $jobName -Parameters $ht -ResourceGroupName $resoureGroupName -AutomationAccountName $AutomationAccountName"
 $newSchedule = register-AzureRmAutomationScheduledRunbook -RunbookName $runbookName -ScheduleName $jobName -Parameters $ht -ResourceGroupName $resoureGroupName -AutomationAccountName $AutomationAccountName -Verbose
 }
 else
 {
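 # The window is outside the schedulable range, so run check4Alerts immediately instead of scheduling it.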
 Set-AzureRmContext -subscriptionName 'Azure Testing'
 write-output "subtracting $windowsize minutes"
 $AlertDateTime = ((get-date).AddMinutes(-($windowsize)))
 $ht = @{}; Get-Variable -Name ('Credential','Subscriptionid','Portallink','alertname','resourcetype','resourceid','alertmetric','alertdatetime','countertype') | foreach { $ht.Add($_.Name,$_.Value)}
 Write-Output "Starting $runbookName parameters $ht"
 Write-Output "Start-AzureRmAutomationRunbook -Name $runbookName -Parameters $ht -ResourceGroupName $resoureGroupName -AutomationAccountName $AutomationAccountName -Verbose"
 Start-AzureRmAutomationRunbook -Name $runbookName -Parameters $ht -ResourceGroupName $resoureGroupName -AutomationAccountName $AutomationAccountName -Verbose
 }
 }
 else
 {
 write-output 'State has been resolved'
 }
 }
 else 
 {
 Write-Error 'This runbook is meant to only be started from a webhook.' 
 }

Now on to the second script: 

This script takes the values passed to it from the webhook above, or from any other script.

The parameters it requires are the same as the keys defined in the hashtable from the first script:

$ht = @{}; Get-Variable -Name ('Subscriptionid'`
,'Portallink','alertname','resourcetype','resourceid','alertmetric'`
,'alertdatetime','countertype') | foreach { $ht.Add($_.Name,$_.Value)}
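As a side note on how those parameters travel: Azure Automation binds each key of the hashtable to the runbook parameter with the same name. Here is a minimal sketch with hypothetical values (the runbook, resource group, and account names are the ones used throughout this post):

# Hypothetical values; each key must match a parameter declared in check4Alerts.
$ht = @{
 SubscriptionId = '00000000-0000-0000-0000-000000000000'
 AlertName      = 'CPUHigh ApiDev'
 ResourceId     = '/subscriptions/<id>/resourceGroups/AzureTesting/providers/microsoft.web/serverfarms/myPlan'
 AlertMetric    = '2'
 AlertDateTime  = (Get-Date).AddMinutes(-10)
 CounterType    = 'CpuPercentage'
}
# Each key is bound to the matching runbook parameter by name.
Start-AzureRmAutomationRunbook -Name 'check4Alerts' -Parameters $ht `
 -ResourceGroupName 'AzureTesting' -AutomationAccountName 'AutomationAccountTest'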

Again we’ll have to get a credential and log in:

To do this you’ll need to create a Credential Asset in your automation account, which is demonstrated here. Then we’ll use the Credential Asset in our automation with Get-AutomationPSCredential. Once we have the credential asset in our runbook, we use Login-AzureRmAccount, which is an alias for Add-AzureRmAccount.

$cred = Get-AutomationPSCredential -Name 'ThomSchumacher'
write-output 'login to azure with automation account'
Add-AzureRmAccount -Credential $cred
Set-AzureRmContext -subscriptionName 'Azure Testing'

If you have more than one subscription, make certain you change to the same subscription that your Azure Automation account is running in.
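If you are not sure which subscriptions the credential can see, a quick check (using cmdlets from the AzureRM.Profile module this runbook already requires) looks something like this:

# List the subscriptions visible to the logged-in credential, then switch
# to the one the Automation account runs in ('Azure Testing' in this post).
Get-AzureRmSubscription
Set-AzureRmContext -SubscriptionName 'Azure Testing'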

Now we’ll do some calculations based on the data passed to this runbook. First we capture the current time; it serves both as the end time for the metric query and to tell our user when the measurement window ended.

$nowTime = get-date

Since we have the time, we can get the metrics for the object passed to this script by using the Get-AzureRmMetric cmdlet.

 Write-output "(((Get-AzureRmMetric -ResourceId $resourceId -StartTime $AlertDateTime -EndTime (get-date) -TimeGrain (new-timespan -Minutes 1)).where{$_.name -eq $counterType}).metricvalues).total"
 $numberOfCounters = (((Get-AzureRmMetric -ResourceId $resourceId `
 -StartTime $AlertDateTime -EndTime $nowTime `
-TimeGrain (new-timespan -Minutes 1)).where{$_.name -eq $counterType}).metricvalues).total

The value of $numberOfCounters is an array holding the Total of each one-minute sample for the given metric.

The Get-AzureRmMetric values are updated every minute, which is why the script is designed around minutes. Now we can count how many samples are greater than our alert metric, also known as the Threshold in the Azure blades. We also need to know what half the measurement count is, using the variable $halfMetric: half the count of $numberOfCounters, rounded down.

 $overMetric = ($numberOfCounters.where{$PSItem -gt $alertMetric}).count
 $halfMetric = [math]::Floor( $numberOfCounters.count /2) 
 Write-output "calculating the number of metrics collected to see if we were over for half the metric time for MetricValue: $alertMetric"
 Write-output "half the metrics rounded down: $halfmetric"
 Write-output "Number of metrics that were over the metric value: $overmetric"

If the value of $overmetric is greater than $halfmetric, then we are going to send an email.

For the purposes of my design I used SendGrid:

Write-output 'Alert would be thrown'
 Write-output 'If alertstatus variable is defined set the email flag to true.'
 Write-output "MetricValue: $alertMetric Number of Alerts over the metric ------: $overmetric "
 $message = "The following alert $($AlertName) has exceeded the AlertThreshold $($Alertmetric). It exceeded it $($overMetric) times. Alert first triggered $(($alertDateTime).DateTime) -- Ending measurement time $(($nowTime).DateTime). Portal address for this alert $($Portallink)" 
 Write-output $message
 Write-Output $AlertName 
 $sendGridApiUser = 'your api user'
 $sendGridApiPwd = 'yourPassword'
 $sendGridApiKey = 'yourkey'
 $from = 'someemailIpicked@ok.com'
 $to = Get-AutomationVariable -Name 'EmailAddress'
 $subject ="Alert From $Alertname - Exceeded threshold $overmetric Times - ResourceType: $resourceType"
 [uri]$sendGridApi = 'https://api.sendgrid.com/api/mail.send.json'
 write-output "building To list $to"
 if($to.Contains(',') -or ($to.Contains("'")) -or ($to.Contains('"')))
 {
 write-output 'To: removing unneeded chars and putting in Array'
 $to= $to -replace ("'",'')
 $to= $to -replace ('"','')
 $to = $to.split(',')
 } 
 foreach($t in $to)
 {
 $mailto +="&to=$t"
 }
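 # The legacy SendGrid v2 mail.send endpoint takes a form-encoded body; $mailto already carries its own &to= separators from the loop above.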
 $SendGridPost = "api_user=$sendGridApiUser&api_key=$sendGridApiPwd$mailto&subject=$subject&text=$message&from=$from"
 Invoke-RestMethod -Method post -Uri $sendGridApi -Body $SendGridPost 

Here is that script in its entirety:

Param
 (
 [Parameter(Mandatory=$true, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
 [string]$SubscriptionId, 
 [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
 [string]$portalLink, 
 [Parameter(Mandatory=$true, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
 [string]$alertName, #CPUHigh ApiDev
 [Parameter(ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
 [string]$resourceType, #microsoft.web/serverfarms
 [Parameter(Mandatory=$true, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
 [string]$resourceId, 
 [Parameter(Mandatory=$true, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
 [string]$alertMetric, #2 aka threshold
 [Parameter(Mandatory=$true, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
 [datetime]$AlertDateTime, #
 [Parameter(Mandatory=$true, ValueFromPipeline=$true, ValueFromPipelineByPropertyName=$true)]
 [string]$counterType #CPUPercentage
 )
 $cred = Get-AutomationPSCredential -Name 'ThomSchumacher'
 write-output 'login to azure with automation account'
 Add-AzureRmAccount -Credential $cred
 if($cred)
 {
 $nowTime = get-date
 write-output 'login to azure with automation account'
 Set-AzureRmContext -subscriptionName 'Azure Testing'
 Write-output "(((Get-AzureRmMetric -ResourceId $resourceId -StartTime $AlertDateTime -EndTime (get-date) -TimeGrain (new-timespan -Minutes 1)).where{$_.name -eq $counterType}).metricvalues).total"
 $numberOfCounters = (((Get-AzureRmMetric -ResourceId $resourceId -StartTime $AlertDateTime -EndTime $nowTime -TimeGrain (new-timespan -Minutes 1)).where{$_.name -eq $counterType}).metricvalues).total
 $overMetric = ($numberOfCounters.where{$PSItem -gt $alertMetric}).count
 $halfMetric = [math]::Floor( $numberOfCounters.count /2) 
 Write-output "calculating the number of metrics collected to see if we were over for half the metric time for MetricValue: $alertMetric"
 Write-output "half the metrics rounded down: $halfmetric"
 Write-output "Number of metrics that were over the metric value: $overmetric"
 if($overMetric -gt $halfmetric) #send an email when more than half of the samples were over the threshold
 { 
 Write-output 'Alert would be thrown'
 Write-output 'If alertstatus variable is defined set the email flag to true.'
 Write-output "MetricValue: $alertMetric Number of Alerts over the metric ------: $overmetric "
 $message = "The following alert $($AlertName) has exceeded the AlertThreshold $($Alertmetric). It exceeded it $($overMetric) times. Alert first triggered $(($alertDateTime).DateTime) -- Ending measurement time $(($nowTime).DateTime). Portal address for this alert $($Portallink)" 
 Write-output $message
 Write-Output $AlertName 
 $sendGridApiUser = 'your api user'
 $sendGridApiPwd = 'yourPassword'
 $sendGridApiKey = 'yourkey'
 $from = 'someemailIpicked@ok.com'
 $to = Get-AutomationVariable -Name 'EmailAddress'
 $subject ="Alert From $Alertname - Exceeded threshold $overmetric Times - ResourceType: $resourceType"
 [uri]$sendGridApi = 'https://api.sendgrid.com/api/mail.send.json'
 write-output "building To list $to"
 if($to.Contains(',') -or ($to.Contains("'")) -or ($to.Contains('"')))
 {
 write-output 'To: removing unneeded chars and putting in Array'
 $to= $to -replace ("'",'')
 $to= $to -replace ('"','')
 $to = $to.split(',')
 } 
 foreach($t in $to)
 {
 $mailto +="&to=$t"
 }
 $SendGridPost = "api_user=$sendGridApiUser&api_key=$sendGridApiPwd$mailto&subject=$subject&text=$message&from=$from"
 Invoke-RestMethod -Method post -Uri $sendGridApi -Body $SendGridPost 
 }
 else
 {
 Write-output 'no alert needed' 
 Write-output "MetricValue: $alertMetric Number of Alerts with the metric: $overmetric"
 $stop = $true
 } 
 }
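If you want to exercise CaptureAlerts.ps1 without waiting for a real alert, a rough test is to post a fabricated payload to its webhook URL. The sketch below is an assumption-heavy illustration: the token URL and every field value are placeholders, and the field names simply mirror what the script above reads from $WebhookData.RequestBody.

# Hypothetical webhook URL and alert values; adjust to your own environment.
$uri = 'https://s1events.azure-automation.net/webhooks?token=<yourToken>'
$body = @{
 status  = 'Activated'
 context = @{
  subscriptionId = '00000000-0000-0000-0000-000000000000'
  name           = 'CPUHigh ApiDev'
  resourceType   = 'microsoft.web/serverfarms'
  resourceId     = '/subscriptions/<id>/resourceGroups/AzureTesting/providers/microsoft.web/serverfarms/myPlan'
  portalLink     = 'https://portal.azure.com/'
  condition      = @{ Threshold = '2'; windowSize = '10'; MetricName = 'CPU Percentage' }
 }
} | ConvertTo-Json -Depth 5
# The runbook receives this JSON as $WebhookData.RequestBody.
Invoke-RestMethod -Method Post -Uri $uri -Body $body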

 

It was a long haul, but worth it. I hope this blog post helps someone else out in their quest to keep alert noise down from Azure.

Until then

Keep Scripting

Thom

SharePoint RootFolder in Query String

reportingservices

I’ve been working on a GUI program in Sapien PowerShell Studio. This program deploys SSRS reports to an integrated SSRS and SharePoint instance. While writing it I discovered that to upload files to SharePoint you must have the full path to where you are placing the file. It turns out that what SharePoint gives you when you browse to a subfolder is a very odd URL with query strings in it. To make sure I get what I want from my user, and so I know where to upload the file, I had to write a function to parse the query string and return the real path to my calling function so I could upload properly. This article is about how I got that working.

First let’s look at the query string I was dealing with:

http://servername/sites/mysite/Shared%20Documents/Forms/Allitems.aspx?RootFolder=%2fsites%2fmysite%2fTest&folderCTID=0x0120004845B289B9EC1E479DF75ADD1F150A9E

If we look at this query string closely, we can see the real folder path for my SharePoint site is the value of the RootFolder query string parameter: %2fsites%2fmysite%2fTest

With that in mind I started looking for a way to parse this string and pull out what comes after the RootFolder parameter. I tried string manipulation and several other methods, but kept thinking to myself there must be a better way. So I dug deeper and found a .NET class that deals with URLs: [System.Web.HttpUtility]
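One caveat before using it: in a fresh console the System.Web assembly may not be loaded yet, so load it first:

Add-Type -AssemblyName System.Web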

Now that I had something to deal with URLs, I sleuthed around the class with PowerShell until I found the ParseQueryString method. Now if I take my URL and feed it to this class:

PS PS:\> [uri]$Url = 'http://servername/sites/mysite/Shared%20Documents/Forms/Allitems.aspx?RootFolder=%2fsites%2fmysite%2fTest&folderCTID=0x0120004845B289B9EC1E479DF75ADD1F150A9E'

PS PS:\> [System.Web.HttpUtility]::ParseQueryString($url)
http://servername/sites/mysite/Shared Documents/Forms/Allitems.aspx?RootFolder
folderCTID

My return results don’t seem to be what I’m looking for. But since they contain two items that look like an array, I’ll query them by index number:

PS PS:\> $a = ([System.Web.HttpUtility]::ParseQueryString($url))

PS PS:\> $a[0]
/sites/mysite/Test

Interesting: the results are in the first array member, so is this a hashtable of some sort?

PS PS:\> $a.gettype()

IsPublic IsSerial Name                BaseType
-------- -------- ----                --------
False    True     HttpValueCollection System.Collections.Specialized.NameValueCollection

No, it’s not a hashtable but a specialized System.Collections.Specialized.NameValueCollection. Hmm, so how do I get the value I want out of the collection?

PS PS:\> $a | get-member -MemberType Properties

TypeName: System.String

Name   MemberType Definition
----   ---------- ----------
Length Property   int Length {get;}

The only property it has is Length, which doesn’t give me what I want: the actual text from the query string. So it must be a method that I need to use:

PS PS:\> $a | get-member -MemberType Method
TypeName: System.String

Name      MemberType Definition
----      ---------- ----------
Clone     Method     System.Object Clone(), System.Object ICloneable.Clone()
CompareTo Method     int CompareTo(System.Object value), int CompareTo(string strB), int IComparable.CompareTo(System.Object obj), int IComparable[string].CompareTo(string other)
Contains  Method     bool Contains(string value)

Hmm, this variable now says it’s of type string, and all the methods are string methods. After looking around for some time I finally tried some things in the ISE and got to the method I was looking for: GetValues

system.collections.specialized.namevaluecollection.getValues

So now I can attempt to see if I can get the values I want from this object:

PS PS:\> $a.GetValues('http://servername/sites/mysite/Shared Documents/Forms/Allitems.aspx?RootFolder')
/sites/mysite/Test

Cool, I got what I was looking for: the value of the RootFolder query string. Now I’m halfway there. All that’s needed is to construct the GetValues method call so it only returns what I’m looking for, RootFolder.

PS PS:\> $a.GetValues(($a.keys | Where-Object{$_ -like "*RootFolder*"}))
/sites/mysite/Test
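As an aside, a slightly simpler route (assuming the same $Url as above) is to feed ParseQueryString only the query portion of the URI, which gives clean keys:

PS PS:\> $clean = [System.Web.HttpUtility]::ParseQueryString($Url.Query.TrimStart('?'))
PS PS:\> $clean['RootFolder']
/sites/mysite/Test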

Now that I have the value I want, I can create a function to return the URL I need for my program. Here is the full function:

function Format-SharepointUrl
{
    param
    (
        [parameter(Mandatory = $true)]
        [uri]$url
    )
    $newUrl = $null
    [System.Collections.Specialized.NameValueCollection]$uCollection = [System.Web.HttpUtility]::ParseQueryString($url)
    if ($uCollection.GetValues(($uCollection.keys | ?{ $_ -like "*RootFolder*" })))
    {
        $cUrl = $uCollection.GetValues(($uCollection.keys | ?{ $_ -like "*RootFolder*" }))
        $newUrl = "$($url.Scheme)://$($url.Host)$cUrl"
    }
    $newUrl
}
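Calling it with the sample URL from earlier looks like this:

PS PS:\> Format-SharepointUrl -url 'http://servername/sites/mysite/Shared%20Documents/Forms/Allitems.aspx?RootFolder=%2fsites%2fmysite%2fTest&folderCTID=0x0120004845B289B9EC1E479DF75ADD1F150A9E'
http://servername/sites/mysite/Test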

I hope this helps someone.

 

Until then keep scripting

Posting Azure Audit results to Office 365 SharePoint

Recently I blogged about auditing Azure resources, and I also blogged about how to work with the OfficeDevPnP.PowerShell module. This post is about how I used both of those to give myself a SharePoint list containing the Azure resources I audited. This lets my users know what is in Azure, and it also lets me put an application name to each item I pulled from Azure.

Here is what my list looks like in SharePoint:

[screenshot: the Azure assets list in SharePoint]

Now that I have my list built, I can import the OfficeDevPnP module:
import-module OfficeDevPnP.PowerShell.V16.Commands

And then connect to my office 365 instance:

Connect-SPOnline -Url 'https://my.sharepoint.com/departments/mysite'

Ok, so I have my connection to my site; now I need to retrieve my Azure listing:

Import-AzureRM
Login-AzureRMAccount -credential $myAzureCredentials

Then, using the Get-DaAzureRMResources function that I blogged about here, I can get all my resources for posting to the Office 365 list:

$myAzureListing = Get-DaAzureRMResources

In order to post to my SharePoint list, I need a field called Application, which doesn’t exist in Azure but does in my SharePoint list. The next loop adds this Application property to my objects.

foreach($a in $myAzureListing )
{ write-debug "$($a.SubscriptionName)"

With the switch we add a field called Application to the Azure listing so that it meets the requirements of my Office 365 list. I have two cases where I need the name to be different from what is specified (the Azure subscription name), so I put those two cases in the switch to account for them.
switch -Wildcard ($a.SubscriptionName)
{
"myApp*" {Add-Member -MemberType NoteProperty -Name 'Application' -InputObject $a -Value 'ThomsApp' }
"CRM" {add-member -MemberType NoteProperty -Name 'Application' -InputObject $a -value 'MyCRM' }
Default
{
Add-Member -MemberType NoteProperty -Name 'Application' -InputObject $a -Value $($a.SubscriptionName )
}
}
}

The Azure listing that my data is going to has a lookup to another SharePoint list called Applications. In order to post the new items from the Azure pull, I’ll need to get the id of each application from that other list. The view of the list is shown below:

[screenshot: the Applications list view]

The $applicationList variable holds the result of getting the list, and $applicationListObj contains the list in object form:
$applicationList = (Get-SPOListItem -List 'Applications' ).fieldvalues
$applicationListObj = $applicationList | ForEach-Object{New-Object psobject -Property $_}
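To illustrate what that conversion does (with a made-up field dictionary, not real list data): each entry in .fieldvalues is a dictionary of column name to value, and wrapping it in a PSObject gives dot-notation access:

# Hypothetical field dictionary standing in for one .fieldvalues entry.
$fieldValues = @{ Title = 'ContosoApp'; Id = 7 }
$obj = New-Object psobject -Property $fieldValues
$obj.Title # ContosoApp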

Now I need to get my existing Azure Assets list to ensure I don’t put any duplicates in it:

$azureSPlistName = 'AzureAssets'
$azureSpList = (get-spolistitem -list 'AzureAssets').fieldvalues | foreach-object{new-object psobject -Property $_}

Now we can loop through each item in the Azure listing and put it into our SharePoint list.
foreach($item in $myAzureListing)
{

The $currentAssets variable gets us the record that matches the item we are currently on in our loop:

$currentAssets = $azureSpList |?{$_.ResourceId -eq $item.resourceid}

Here we get the application id, so that when we update the list we use the id of the application instead of the friendly name:

$appId = ($applicationListObj | ?{$_.Title -eq $item.application}).id

Now we’ll build our hashtable to write to the SharePoint list:

$azureHash = @{'Title' = $($item.Name);'Application' = $appId ; 'SubscriptionName' = $item.SubscriptionName;'ResourceName' = $item.Resourcename; 'ResourceType' = $item.ResourceType ; 'ResourceGroupName' = $item.ResourceGroupName; 'Tags' = $item.Tags; 'Location' = $item.Location; 'ResourceId'= $item.ResourceId; 'SubscriptionId' = $item.SubscriptionId }

Now we check to see if the record already exists in the SharePoint list; if it does, we’ll perform an update.

If($currentAssets) #we found the record in the sharepoint site now we need to update.
{
Set-SPOListItem -List $azureSPlistName -Identity $currentAssets.id -Values $azureHash
}
else #since we know there isn't another record of this type in our sharepoint list we'll just add it.
{
add-spolistitem -list $azureSPlistName -Values $azureHash
}
}

Hopefully this helps someone with updating or making a new list from objects in Azure.

Full script is below:

import-module OfficeDevPnP.PowerShell.V16.Commands
Connect-SPOnline -Url 'https://my.sharepoint.com/departments/mysite' -Credentials $sp2credentials
Import-AzureRM
Login-AzureRMAccount -credential $myAzureCredentials
$myAzureListing = Get-DaAzureRMResources
foreach($a in $myAzureListing)
{ write-debug "$($a.SubscriptionName)"
switch -Wildcard ($a.SubscriptionName)
{
"myApp*" {Add-Member -MemberType NoteProperty -Name 'Application' -InputObject $a -Value 'ThomsApp' }
"CRM" {add-member -MemberType NoteProperty -Name 'Application' -InputObject $a -value 'myCRM' }
Default
{
Add-Member -MemberType NoteProperty -Name 'Application' -InputObject $a -Value $($a.SubscriptionName )
}
}
}
$applicationList = (Get-SPOListItem -List 'Applications' ).fieldvalues
$applicationListObj = $applicationList | ForEach-Object{New-Object psobject -Property $_}
$azureSPlistName = 'AzureAssets'
$azureSpList = (get-spolistitem -list 'AzureAssets').fieldvalues | foreach-object{new-object psobject -Property $_}
foreach($item in $myAzureListing)
{
$currentAssets = $azureSpList |?{$_.ResourceId -eq $item.resourceid}
$appId = ($applicationListObj | ?{$_.Title -eq $item.application}).id
$azureHash = @{'Title' = $($item.Name);'Application' = $appId ; 'SubscriptionName' = $item.SubscriptionName;'ResourceName' = $item.Resourcename; 'ResourceType' = $item.ResourceType ; 'ResourceGroupName' = $item.ResourceGroupName; 'Tags' = $item.Tags; 'Location' = $item.Location; 'ResourceId'= $item.ResourceId; 'SubscriptionId' = $item.SubscriptionId }
If($currentAssets)
{
Set-SPOListItem -List $azureSPlistName -Identity $currentAssets.id -Values $azureHash
}
else
{
add-spolistitem -list $azureSPlistName -Values $azureHash
}
}

Until then keep scripting