Just Hash It

I have been looking high and low for a good way to compare one variable to another, and to do it quickly. In my search I found an article on Stack Overflow that led me to create a function you can use to compare one variable to another and get a simple $true or $false answer telling you whether they are the same. This article explains that concept.

To start with, I need to create a function block that takes two parameters: the item I'm using as a reference ($Reference) and the item/variable I'll use as the difference ($Difference).


function Compare-Variables
{
param([string]$Reference, [string]$difference)

}

Using the example from the post on Stack Overflow, I need to create objects for the text encoding (System.Text.UTF8Encoding), the hash provider (System.Security.Cryptography.MD5CryptoServiceProvider), and System.BitConverter.

Working backwards from those objects to the comparison, here is what takes place.

Step 1: Take the contents of each variable and turn them into JSON using ConvertTo-Json.

$ref = $reference.CacheValue | ConvertTo-Json -Depth 100
$diff = $difference.CacheValue | ConvertTo-Json -Depth 100 

Now that I have it as JSON (so long as the object isn't nested more than 100 levels deep), I have the entire variable and its child objects in the variables $ref and $diff.
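
As an aside, the -Depth parameter is what controls how much of the nesting survives serialization. A quick sketch of my own (not from the original comparison) shows the effect with a nested hashtable:

$nested = @{ a = @{ b = @{ c = 'deep' } } }
$nested | ConvertTo-Json -Depth 1    # the 'b' level is flattened to the string 'System.Collections.Hashtable'
$nested | ConvertTo-Json -Depth 5    # the full structure is preserved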

Step 2: Since I have those in variables, I can get the bytes for each one by calling the UTF8 GetBytes method. Using the value 'test' you'll see that I get back a byte for each character:


$utf8 = [System.Text.UTF8Encoding]::new()

$utf8.GetBytes('test')
116
101
115
116 

Step 3: Now that my variable is in bytes, I can compute a hash for those bytes with System.Security.Cryptography.MD5CryptoServiceProvider:


$md5 = [System.Security.Cryptography.MD5CryptoServiceProvider]::new()
$md5.ComputeHash($utf8.GetBytes('test'))
9
143
107
205
70
33
211
115
202
222
78
131
38
39
180
246 

Step 4: Now that I have my computed hash, I can convert it into a readable MD5 sum with System.BitConverter:

[System.BitConverter]::ToString($md5.ComputeHash($utf8.GetBytes('test')))
09-8F-6B-CD-46-21-D3-73-CA-DE-4E-83-26-27-B4-F6
 

Step 5: Now that I have a hash string for both of my variables, I can simply compare the two hashes with -eq and get an answer of $true or $false.
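
In isolation, that comparison is just the two BitConverter strings compared with -eq (this mirrors what the completed script below does):

$hashref = [System.BitConverter]::ToString($md5.ComputeHash($utf8.GetBytes($ref)))
$hashdif = [System.BitConverter]::ToString($md5.ComputeHash($utf8.GetBytes($diff)))
$hashref -eq $hashdif    # $true when both variables serialize to identical JSON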

The completed script is below:

function Compare-Variables
{
    param([object]$Reference, [object]$Difference, [int]$ObjectDepth = 2)
    $utf8  = [System.Text.UTF8Encoding]::new()
    $match = $false
    $md5   = [System.Security.Cryptography.MD5CryptoServiceProvider]::new()
    # Serialize both variables to JSON, hash the UTF8 bytes, and compare the hash strings
    $ref  = $Reference  | ConvertTo-Json -Depth $ObjectDepth
    $diff = $Difference | ConvertTo-Json -Depth $ObjectDepth
    $hashref = [System.BitConverter]::ToString($md5.ComputeHash($utf8.GetBytes($ref)))
    $hashdif = [System.BitConverter]::ToString($md5.ComputeHash($utf8.GetBytes($diff)))
    $match = $hashref -eq $hashdif
    # Clean up the intermediate objects and return the result
    $hashref = $diff = $ref = $utf8 = $md5 = $null
    $match
}
 

Testing this Function:

Simple test with just text: now I can call this function and get $true if the variables match and $false if they don't.

$a = 'test'
$b = 'test'
Compare-Variables -Reference $a -difference $b
True
$a = 'test2'
$b = 'test'
Compare-Variables -Reference $a -difference $b
False

Test with an object from a REST API: now let's try something that we know will have a fair amount of data in it: GitHub's REST API.

 $a = 'test'
$b = 'test'
Compare-Variables -Reference $a -difference $b
True
$a = Invoke-RestMethod -uri http://api.github.com
$b = Invoke-RestMethod -Uri http://api.github.com
Compare-Variables -Reference $a -difference $b
True
$a = Invoke-RestMethod -Uri http://api.github.com/emojis
$b = Invoke-RestMethod -uri http://api.github.com
 Compare-Variables -Reference $a -difference $b
False

I hope this helps someone

Until then keep Scripting

Thom


What the Null??

Recently I've been working on some code for querying schedules in SSRS. I discovered that the way PowerShell passes a null to another function isn't what the SSRS method expected.

That started me wondering: what the heck is PowerShell's $null really set to?

Based on this blog article, I'm clearly not the only one with this question. Cody Konior uncovered several other ways to declare a null:

If you test each of these against PowerShell's $null you get False:

[Screenshot: whattheNull – each of the null look-alikes compared against $null returns False]
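
Reproduced as a quick sketch (these are two of the null look-alikes from Cody's post; I'm recreating the screenshot from memory, so treat the exact list as an assumption):

$null -eq [System.DBNull]::Value    # False
$null -eq [NullString]::Value       # False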

If we use some of the other comparisons maybe we’ll get to what $null is really set to:

PS ps:\> $b = $null
PS ps:\> [string]::IsNullOrWhiteSpace($b)
True
PS ps:\> [string]::IsNullOrWhiteSpace($b)
True
PS ps:\>

These evaluate to the value you'd expect: all $true. I certainly don't know the language as well as Kirk Munro (@Poshoholic), and he pointed me to a class that, when compared to PowerShell's $null, came up true:

[System.Management.Automation.Internal.AutomationNull]::Value

In a blog post about an issue around null, Jason Shirk (@lzybkr) explains it this way:

[Screenshot: ShirkNull – Jason Shirk's explanation of AutomationNull]

Now I can test for PowerShell's null, and this explains why $null is not equal to its C# equivalent.

$Null -eq [System.Management.Automation.Internal.AutomationNull]::Value

True

Yet more detail on why $Null is different in PowerShell (a more detailed example).
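
A quick way to see where AutomationNull actually comes from: it is what a pipeline or script block that produces no output returns (my own illustration, not from the original post):

$nothing = & { }        # a script block that emits no output
$null -eq $nothing      # True
@($nothing).Count       # 0 - AutomationNull enumerates as an empty collection
@($null).Count          # 1 - a 'real' $null is a single element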

Moral of the story: if you are calling a method that expects a null, make certain you pass the right null for the method you are calling.

 


I hope this helps someone

Until then keep Scripting

Thom

Creating a DataDriven Subscription from a file

In working with SSRS I ran into a problem when I wanted to delete an RDL: if I delete an RDL, its subscriptions are deleted as well. Since I've put most of my SSRS file management and data source creation into continuous integration, I needed a way to save off subscriptions before I attempted a delete. I talk about how that is done in the post Saving SSRS Subscriptions to File. This post is about how I consume the saved-off files and put the subscriptions in place in another environment.

If you've been following my other blog posts on SSRS, you'll know I've written about creating an SSRS data source and testing an SSRS data source. In each of those scripts I start with a proxy to the ReportService2010.asmx web service, which exposes the functions needed to operate on SSRS. I assume you've read one of those articles and know that you need to create that proxy first.

On to the method that we'll call to recreate the data-driven subscription:

CreateDataDrivenSubscription. A call to this method requires the following parameters:

item – Type: System.String
The path to the report the subscription will be created for.

ExtensionSettings – Type: ReportService2010.ExtensionSettings
An ExtensionSettings object that contains a list of settings that are specific to the delivery extension.

DataRetrievalPlan – Type: ReportService2010.DataRetrievalPlan
A DataRetrievalPlan object that provides settings that are required to retrieve data from a delivery query. The DataRetrievalPlan object contains a reference to a DataSetDefinition object and a DataSourceDefinitionOrReference object.

Description – Type: System.String
A meaningful description that is displayed to users.

EventType – Type: System.String
The type of event that triggers the data-driven subscription. The valid values are TimedSubscription or SnapshotUpdate.

MatchData – Type: System.String
The data that is associated with the specified EventType parameter. This parameter is used by an event to match the data-driven subscription with an event that has fired.

Parameters – Type: ReportService2010.ParameterValueOrFieldReference[]
An array of ParameterValueOrFieldReference objects that contains a list of parameters for the item.

Since I'm using the proxy's classes, one of the first things I must make sure I do is create a proxy to the web service I want to call; everything else will fail if I haven't done that first. Then I'll read in the report files that I want to operate on with Get-ChildItem. I chose to use the XML export method.


$ssrsproxy = New-WebServiceProxy -Uri http://yourwebsite/yourreports/_vti_bin/ReportServer/ReportService2010.asmx -UseDefaultCredential -namespace 'SSRSProxy' -class 'ReportService2010'
$reportExportPath = 'C:\temp\reports3'
$reportFiles = Get-ChildItem $reportExportPath -Filter *.xml

Now that I have the reports I want to operate on, I can iterate through each of these files, rebuild the objects, and submit them to the new server. So that I can change URLs from one server to another, I use $source and $destination variables to assist.

Since each saved-off report has seven items in it, we have to go through each item (object) in the XML and convert it back to the type of object that the method call expects. The first object we need to rebuild that is a little more complex is the extension settings. An ExtensionSettings object can contain two other kinds of objects: a ParameterFieldReference and a ParameterValue. So we test the property names of each object to decide whether to build a ParameterFieldReference or a ParameterValue.


foreach($parameterField in $reportobject.extensionSettings.ParameterValues)
{
    if($parameterField.psobject.Properties.name[0] -eq 'ParameterName') # this entry is a parameter field reference
    {
        $a = [SSRSProxy.ParameterFieldReference]::new()
        Write-Verbose 'Create an object of type ParameterFieldReference.'
        $a.FieldAlias = $parameterField.FieldAlias
        $a.ParameterName = $parameterField.ParameterName
    }
    elseif($parameterField.psobject.Properties.name[0] -eq 'Name') # this entry is a parameter value
    {
        $a = [SSRSProxy.ParameterValue]::New()
        Write-Verbose 'Create an object of type ParameterValue.'
        $a.Label = $parameterField.Label
        $a.Name  = $parameterField.Name
        $a.Value = $parameterField.Value
    }
    $paramvalues += $a
}
$extensionSettings.ParameterValues = [SSRSProxy.ParameterValueOrFieldReference[]]$paramvalues

Once we've built the extension settings, we need to rebuild the DataRetrievalPlan. The data retrieval plan includes the reference to the new location for the data source; this is where we use $source and $destination to our advantage. This is done by setting the Item property on the data retrieval plan to the DataSourceReference we wish to use.


[SSRSProxy.DataRetrievalPlan]$DataRetrievalPlan = New-Object SSRSProxy.DataRetrievalPlan
Write-Verbose 'Create an object of type DataRetrievalPlan.'

$DataRetrievalPlan.DataSet = $reportobject.DataRetrievalPlan.DataSet
[SSRSProxy.DataSourceReference]$dsReference = $reportobject.DataRetrievalPlan.Item
$src  = ([uri]$source).AbsoluteUri
$dest = ([uri]$destination).AbsoluteUri
$dsReference.Reference = (([uri]$dsReference.Reference).AbsoluteUri) -replace $src, $dest
Write-Verbose "Datasource reference $dsReference - use the value for the data source you want this data-driven report to consume"
$DataRetrievalPlan.Item = $dsReference

Now the last few bits of information that need to be added are the ParameterValueOrFieldReference, plus the report description, event type, and match data.


$description = $reportobject.Description
$eventtype   = $reportobject.eventtype
$matchdata   = $reportobject.matchdata

$b = [SSRSProxy.ParameterFieldReference]::new()
Write-Verbose 'Create an object of type ParameterFieldReference.'
$b.FieldAlias = $reportobject.parameters.fieldalias
$b.ParameterName = $reportobject.parameters.ParameterName
[SSRSProxy.ParameterValueOrFieldReference]$ParameterValueOrFieldReference = $b

Now that we have those set, the last thing to do is set the destination path for the report. Then we'll call the method and hope we don't hit an exception.


$itemPath = "$destination/$($reportobject.subscription.report)"
try
{
    Write-Verbose 'Now that the object is re-constituted we can put this in the SSRS instance we wish to push it to'
    $ssrsproxy.CreateDataDrivenSubscription($itemPath, $extensionsettings, $DataRetrievalPlan, $description, $eventtype, $matchdata, $ParameterValueOrFieldReference)
}
catch
{
    "Error was $_"
    $line = $_.InvocationInfo.ScriptLineNumber
    "Error was in Line $line"
}

 

Full script for this follows:

Write-Verbose " Destination for where the saved subscriptions will be pushed to"
$destination = 'http://yourwebsite/sites/datasourcetest/Shared%20Documents'
$source = 'http://yourwebsite/sites/Reports/Shared%20Documents'

$reportFiles = Get-ChildItem $reportExportPath -Filter *.xml
foreach($file in $reportFiles)
{
 $reportobject = Import-Clixml -path ($file.fullname)
 Write-Verbose "Create a object of type ExtensionSettings"
 $extensionSettings = New-Object -typename 'SSRSProxy.ExtensionSettings' 
 $extensionSettings.Extension = $reportobject.extensionSettings.Extension
 $paramvalues = @()
 foreach($parameterField in $reportobject.extensionSettings.ParameterValues)
 {
 if($parameterfield.psobject.Properties.name[0] -eq 'ParameterName') #rebuild the object into an extenstion setting this one contains a parameter field reference
 {
 $a = [SSRSProxy.ParameterFieldReference]::new()
 Write-Verbose 'Create a object of type ParameterField reference.'
 $a.FieldAlias = $parameterfield.fieldalias
 $a.ParameterName = $parameterfield.ParameterName
 }
 elseif($parameterfield.psobject.Properties.name[0] -eq 'Name') #rebuild the object into an extension settings object this one contains a param value
 {
 $a = [SSRSProxy.ParameterValue]::New()
 Write-Verbose 'Create a object of type ParameterValue reference.'
 $a.Label = $parameterField.Label
 $a.Name = $parameterField.Name
 $a.Value = $parameterField.Value
 }
 $paramvalues += $a
 }
 $extensionSettings.ParameterValues = [ssrsproxy.parametervalueorfieldreference[]]$paramvalues

 [SSRSProxy.DataRetrievalPlan]$DataRetrievalPlan = New-Object SSRSProxy.DataRetrievalPlan
 Write-Verbose 'Create a object of type DataRetrievalPlan reference.'

 $DataRetrievalPlan.DataSet = $reportobject.DataRetrievalPlan.DataSet
 [SSRSProxy.DataSourceReference]$dsReference = $reportobject.DataRetrievalPlan.Item
 $src = ([uri]$source).absoluteuri
 $dest = ([uri]$destination).absoluteuri
 $dsReference.Reference = (([uri]$dsReference.Reference).AbsoluteUri) -replace $src,$dest
 Write-Verbose "Datasource Reference $dsreference use the value for the datasource you want this data driven report to consume"
 $DataRetrievalPlan.Item = $dsReference
 $description = $reportobject.Description
 $eventtype = $reportobject.eventtype
 $matchdata = $reportobject.matchdata

 $b = [Ssrsproxy.parameterfieldreference]::new()
 Write-Verbose 'Create a object of type parameterfieldreference reference.'
 $b.FieldAlias = $reportobject.parameters.fieldalias
 $b.ParameterName = $reportobject.parameters.ParameterName
 [SSRSProxy.ParameterValueOrFieldReference]$ParameterValueOrFieldReference = $b

 $itemPath = "$destination/$($reportobject.subscription.report)"
 try
 {
 Write-Verbose "Now that the object is re-constituted we can put this in the SSRS instance we wish to push it to"
 $ssrsproxy.CreateDataDrivenSubscription($itempath , $extensionsettings , $DataRetrievalPlan, $description, $eventtype, $matchdata, $ParameterValueOrFieldReference) 
 }
 Catch
 {
 "Error was $_"
 $line = $_.InvocationInfo.ScriptLineNumber
 "Error was in Line $line"
 }
}

I hope this helps someone

Until then keep Scripting

Thom

Saving SSRS Subscriptions to File

In working with SSRS I ran into a problem when I wanted to delete an RDL: if I delete an RDL, its subscriptions are deleted as well. Since I've put most of my SSRS file management and data source creation into continuous integration, I needed a way to save off subscriptions before I attempted a delete. This article is about the PowerShell I wrote to accomplish that task.

If you've been following my other blog posts on SSRS, you'll know I've written about creating an SSRS data source and testing an SSRS data source. In each of those scripts I start with a proxy to the ReportService2010.asmx web service, which exposes the functions needed to operate on SSRS. I assume you've read one of those articles and know that you need to create that proxy first.

On to the methods that we'll call to get the two different types of SSRS subscriptions:

Normal subscriptions & DataDriven Subscriptions

For a normal subscription you'll need to call the ListSubscriptions method. This method expects the path of the report you wish to get the subscriptions for. To get the reports, we'll use the ListChildren method to list all the children of the current site, and then find the subscriptions for each report.

function Get-Subscriptions
{
  param([object]$ssrsproxy, [string]$site, [switch]$DataDriven)
  write-verbose 'Path to where the reports are must be specified to get the subscriptions you want.. Root (/) does not seem to get everything'
  $items = $ssrsproxy.ListChildren($site,$true) | Where-Object{$_.typename -eq 'report'}
  $subprops = $ddProps= @()

Now that we have the list of reports in the $items variable, we can ask for the subscriptions for each item:


foreach($item in $items)
 {
 $subs = $ssrsproxy.ListSubscriptions($item.path)

}

Now that we have our subscriptions in the $subs variable, we can check whether there was a result. If there was, we can then get the properties of each subscription type. We'll know we have a data-driven subscription by checking the .isdatadriven property on each subscription in the $subs array.


 if($subs)
 {
   foreach($sub in $subs)
   {

    if($sub.isdatadriven -eq 'true')
      {
        $ddProps += Get-DataDrivenSubscriptionProperties -subscription $sub -ssrsproxy $ssrsproxy
      }
      elseif(-not $DataDriven)
      {
        $subProps += Get-SubscriptionProperties -Subscription $sub -ssrsproxy $ssrsproxy
      }

    }
 }
 if($DataDriven)
 {$ddProps}
 else {
 $subprops
 }

Now on to explaining Get-DataDrivenSubscriptionProperties and Get-SubscriptionProperties.

Get-SubscriptionProperties calls the GetSubscriptionProperties method with the ID of the subscription, which returns seven values through reference parameters (plus the owner as the return value). In the function below I first set all seven reference variables to null, then I call the method. On a successful return I populate my PowerShell object, [SSRSObject] (which is a PowerShell class). If you choose not to use a class and want a standard object instead, the PSCustomObject code is commented out so it can be used if needed.
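
The [SSRSObject] class itself isn't shown in these snippets; here is a minimal sketch of what it might look like, with the property names inferred from the assignments in the functions below (the [object] types are an assumption that keeps it permissive):

class SSRSObject
{
    # property names match the assignments in Get-SubscriptionProperties / Get-DataDrivenSubscriptionProperties
    [object]$Subscription
    [object]$Owner
    [object]$ExtensionSettings
    [object]$Description
    [object]$DataRetrievalPlan
    [object]$Active
    [object]$Status
    [object]$EventType
    [object]$MatchData
    [object]$Parameters
}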


function Get-SubscriptionProperties
{
 param([object]$Subscription,
 [object]$ssrsproxy)
 $subextensionSettings = $subDataRetrievalPlan = $subDescription = $subactive = $substatus = $subeventtype = $submatchdata = $subparameters = $Null

 $subOwner = $ssrsproxy.GetSubscriptionProperties($subscription.SubscriptionID,[ref]$subextensionSettings,[ref]$subDescription,[ref]$subactive,[ref]$substatus,[ref]$subeventtype,[ref]$submatchdata,[ref]$subparameters)
 $ssrsobject = [SSRSObject]::New()
 $ssrsobject.subscription = $Subscription
 $ssrsobject.Owner = $subOwner
 $ssrsobject.ExtensionSettings = $subextensionSettings
 $ssrsobject.Description = $subDescription
 $ssrsobject.DataRetrievalPlan = $subDataRetrievalPlan
 $ssrsobject.Active = $subactive
 $ssrsobject.Status = $substatus
 $ssrsobject.EventType = $subeventtype
 $ssrsobject.MatchData = $submatchdata
 $ssrsobject.Parameters = $subparameters
 $ssrsobject
 <#
 [PSCustomObject]@{
 'Owner' = $subOwner
 'extensionSettings' = $subextensionSettings
 'Description' = $subDescription
 'active' = $subactive
 'status' =$substatus
 'eventtype' =$subeventtype 
 'matchdata' = $submatchdata
 'parameters' = $subparameters
 }
 #>
}

For Get-DataDrivenSubscriptionProperties we do all the same things as with the previous subscription type. We call the GetDataDrivenSubscriptionProperties method, which returns the same values by reference (plus the data retrieval plan). On a successful return I populate the [SSRSObject] PowerShell class; as before, the PSCustomObject alternative is commented out so it can be used if needed.


function Get-DataDrivenSubscriptionProperties 
{
 param([object] $Subscription,
 [object]$ssrsproxy)
 $ssrsobject = [SSRSObject]::New()
 $sid = $Subscription.SubscriptionID
 $ddextensionSettings = $ddDataRetrievalPlan = $ddDescription = $ddactive = $ddstatus = $ddeventtype = $ddmatchdata = $ddparameters = $Null
 $ddOwner = $ssrsproxy.GetDataDrivenSubscriptionProperties($sid,[ref]$ddextensionSettings,[ref]$ddDataRetrievalPlan`
 ,[ref]$ddDescription,[ref]$ddactive,[ref]$ddstatus,[ref]$ddeventtype,[ref]$ddmatchdata,[ref]$ddparameters)

 $ssrsobject.subscription = $Subscription
 $ssrsobject.Owner = $ddOwner
 $ssrsobject.ExtensionSettings = $ddextensionSettings
 $ssrsobject.Description = $ddDescription
 $ssrsobject.DataRetrievalPlan = $ddDataRetrievalPlan
 $ssrsobject.Active = $ddactive
 $ssrsobject.Status = $ddstatus
 $ssrsobject.EventType = $ddeventtype
 $ssrsobject.MatchData = $ddmatchdata
 $ssrsobject.Parameters = $ddparameters
 $ssrsobject
 <# [PSCustomObject]@{
 'Owner' = $ddOwner
 'extensionSettings' = $ddextensionSettings
 'DataRetrievalPlan' = $ddDataRetrievalPlan
 'Description' = $ddDescription
 'active' = $ddactive
 'status' =$ddstatus
 'eventtype' =$ddeventtype 
 'matchdata' = $ddmatchdata
 'parameters' = $ddparameters
 } #>
}

Now that I have each report's subscription in an object, I persist it to disk with either the Export-Clixml or the ConvertTo-Json cmdlet:

function New-XMLSubscriptionfile
{
    [CmdletBinding()]
    [Alias()]
    param([psobject]$subscriptionObject, [string]$path)
    if(Test-Path $path -PathType Leaf)
    {
        $path = Split-Path $path
    }
    if(-not (Test-Path $path))
    {
        mkdir $path
    }
    foreach($sub in $subscriptionObject)
    {
        $reportName = (($sub.subscription.report).split('.'))[0]
        $filename = "$path\$reportName.xml"
        $sub | Export-Clixml -Depth 100 -Path $filename
    }
}

function New-JsonSubscriptionFile
{
    [CmdletBinding()]
    [Alias()]
    param([psobject]$subscriptionObject, [string]$path)
    if(Test-Path $path -PathType Leaf)
    {
        $path = Split-Path $path
    }
    if(-not (Test-Path $path))
    {
        mkdir $path
    }
    foreach($sub in $subscriptionObject)
    {
        $reportName = (($sub.subscription.report).split('.'))[0]
        $filename = "$path\$reportName.json"
        $sub | ConvertTo-Json -Depth 100 | Out-File $filename
    }
}

To see the entire script see this Gist

I hope this helps someone

Until then keep Scripting

Thom

Profile creation with PowerShell and the community

I was asked to see if I could create a script to create a user's profile without the user being logged in.

So I Bing'd and Goog'd and couldn't find a PowerShell module that could do that. I then began searching for folks who could get me started. Once I got the "starting" information I was able to put a script together; this led me to a base script that creates a user profile with P/Invoke. When I first put this code together it caused ISE/PowerShell to crash, so I was again perplexed as to what to do next. I posted a question about it, and thankfully someone else had already started working on the same thing. He gave me a working way to get around the crashes I was experiencing, using his script. Now on to what it does and how it works.

The main task was to create a profile so I’ll explain that first.

In order to use P/Invoke I had to bring in System.Runtime.InteropServices.

Thankfully Adam Driscoll did all the heavy lifting with his script that creates this type:


 Add-Type -TypeDefinition '
 using System;
 using System.Runtime.InteropServices;
 public static class PInvoke {
 [DllImport("userenv.dll", SetLastError = true, CharSet = CharSet.Auto)]
 public static extern int CreateProfile( [MarshalAs(UnmanagedType.LPWStr)] String pszUserSid, [MarshalAs(UnmanagedType.LPWStr)] String pszUserName, [Out][MarshalAs(UnmanagedType.LPWStr)] System.Text.StringBuilder pszProfilePath, uint cchProfilePath);
 }
 '

The next step was how to call that added type with the proper information.


$pszProfilePath = New-Object -TypeName System.Text.StringBuilder
[int]$results = [PInvoke]::CreateProfile($UserSid, $UserName, $pszProfilePath, $ProfilePath)

$stringbuff = New-Object System.Text.StringBuilder(260)
[system.uint32]$a = $stringbuff.Capacity
$sid = ((Get-ADUser -Identity 'brtestlocaluser').SID.Value)
CreateProfile -UserSid $sid -UserName 'brtestlocaluser' -ProfilePath $a

Here is the full version of that code, which caused my ISE and PowerShell processes to crash:

function CreateProfile
{
    param([String]$UserSid, [String]$UserName, [system.uint32]$ProfilePath)
    Add-Type -TypeDefinition '
    using System;
    using System.Runtime.InteropServices;
    public static class PInvoke {
        [DllImport("userenv.dll", SetLastError = true, CharSet = CharSet.Auto)]
        public static extern int CreateProfile( [MarshalAs(UnmanagedType.LPWStr)] String pszUserSid, [MarshalAs(UnmanagedType.LPWStr)] String pszUserName, [Out][MarshalAs(UnmanagedType.LPWStr)] System.Text.StringBuilder pszProfilePath, uint cchProfilePath);
    }
    '
    $pszProfilePath = New-Object -TypeName System.Text.StringBuilder
    [int]$results = [PInvoke]::CreateProfile($UserSid, $UserName, $pszProfilePath, $ProfilePath)
}
$stringbuff = New-Object System.Text.StringBuilder(260)
[system.uint32]$a = $stringbuff.Capacity
$sid = ((Get-ADUser -Identity 'brtestlocaluser').SID.Value)
CreateProfile -UserSid $sid -UserName 'brtestlocaluser' -ProfilePath $a

So with that in mind I sent out some tweets to find out why this was crashing, and I came across someone who had already done some of the work needed to call a P/Invoke. What he did differently from what I was doing was to "wrap" the P/Invoke in a script scope. So the code I'm showing above ended up becoming two functions: one to register the native method, and another to add the native method.

function Register-NativeMethod
{
    [CmdletBinding()]
    [Alias()]
    [OutputType([int])]
    Param
    (
        # The DLL that exports the native method
        [Parameter(Mandatory=$true,
                   ValueFromPipelineByPropertyName=$true,
                   Position=0)]
        [string]$dll,

        # The C# signature of the native method
        [Parameter(Mandatory=$true,
                   ValueFromPipelineByPropertyName=$true,
                   Position=1)]
        [string]
        $methodSignature
    )

    $script:nativeMethods += [PSCustomObject]@{ Dll = $dll; Signature = $methodSignature; }
}

Adding the Native Method:

function Add-NativeMethods
{
    [CmdletBinding()]
    [Alias()]
    [OutputType([int])]
    Param($typeName = 'NativeMethods')

    $nativeMethodsCode = $script:nativeMethods | ForEach-Object { "
        [DllImport(`"$($_.Dll)`")]
        public static extern $($_.Signature);
    " }

    Add-Type @"
        using System;
        using System.Text;
        using System.Runtime.InteropServices;
        public static class $typeName {
            $nativeMethodsCode
        }
"@
}

Now to show how they are called in the new function that creates a user profile. The first thing that is done is to try and see whether the user we need to create a profile for is local to the machine:

New-LocalUser -username $UserName -password $Password;

If that user is local, we go to the New-LocalUser function in the same script. Once that completes we are on to the P/Invoke code. First we declare a name for the type that will hold our P/Invoke method; in this case it's going to be UserEnvCP. Then we check whether it is already declared with the if statement:

$methodName = 'UserEnvCP'
    $script:nativeMethods = @();

    if (-not ([System.Management.Automation.PSTypeName]$MethodName).Type)
    {

If it's not in our session, this is where we use the functions described above to get our P/Invoke registered. We call Register-NativeMethod with our DLL and the method signature to register it, and then immediately after that we add the native method so we can call it.

Register-NativeMethod "userenv.dll" "int CreateProfile([MarshalAs(UnmanagedType.LPWStr)] string pszUserSid,`
         [MarshalAs(UnmanagedType.LPWStr)] string pszUserName,`
         [Out][MarshalAs(UnmanagedType.LPWStr)] StringBuilder pszProfilePath, uint cchProfilePath)";

        Add-NativeMethods -typeName $MethodName;

With the $methodName type added, we can now call it and create our profile:

    try
    {
        [UserEnvCP]::CreateProfile($userSID.Value, $Username, $sb, $pathLen) | Out-Null;
    }

Full code for the explained function is below:

function Create-NewProfile {

    [CmdletBinding()]
    [Alias()]
    [OutputType([int])]
    Param
    (
        # Param1 help description
        [Parameter(Mandatory=$true,
                   ValueFromPipelineByPropertyName=$true,
                   Position=0)]
        [string]$UserName,

        # Param2 help description
        [Parameter(Mandatory=$true,
                   ValueFromPipelineByPropertyName=$true,
                   Position=1)]
        [string]
        $Password
    )

    Write-Verbose "Creating local user $Username";

    try
    {
        New-LocalUser -username $UserName -password $Password;
    }
    catch
    {
        Write-Error $_.Exception.Message;
        break;
    }
    $methodName = 'UserEnvCP'
    $script:nativeMethods = @();

    if (-not ([System.Management.Automation.PSTypeName]$MethodName).Type)
    {
        Register-NativeMethod "userenv.dll" "int CreateProfile([MarshalAs(UnmanagedType.LPWStr)] string pszUserSid,`
         [MarshalAs(UnmanagedType.LPWStr)] string pszUserName,`
         [Out][MarshalAs(UnmanagedType.LPWStr)] StringBuilder pszProfilePath, uint cchProfilePath)";

        Add-NativeMethods -typeName $MethodName;
    }

    $localUser = New-Object System.Security.Principal.NTAccount("$UserName");
    $userSID = $localUser.Translate([System.Security.Principal.SecurityIdentifier]);
    $sb = new-object System.Text.StringBuilder(260);
    $pathLen = $sb.Capacity;

    Write-Verbose "Creating user profile for $Username";

    try
    {
        [UserEnvCP]::CreateProfile($userSID.Value, $Username, $sb, $pathLen) | Out-Null;
    }
    catch
    {
        Write-Error $_.Exception.Message;
        break;
    }
}

Many thanks to the members of the community that helped me with getting this script built and working (@Ms_dminstrator, @adamdriscoll )…. The entire script can be found on my gist here:

I hope this helps someone

Until then keep Scripting

Thom

Adding a user to a Group in Dynamics CRM 2016

Recently I’ve had to add a user to Dynamics CRM 2016. This post is about how I did that with a module from the PowerShell Gallery.

The first thing I needed to do was find something that was available for use against Dynamics CRM 2016. I searched the PowerShell Gallery and found this module: Microsoft.Xrm.Data.Powershell. In addition, I found some handy samples for working with the module here: Microsoft.Xrm.Data.PowerShell.Samples.

I was able to take these samples and come up with a usable script to add users to a group in this application. My purpose was to do the unthinkable: add users to an admin group. While probably not the best thing to do, in my situation it was what I needed. So here is how I began.

I looked at this sample: UpdateCrmUserSettings.ps1

This helped me immensely in figuring out how to connect to my CRM instance:


$adminUserCredentials = get-credential

$organizationName = 'MyOrg'

$serverUrl = 'http://mycrmserver.mycompany.com:80'

$loadedandCorrectVersion = (get-command -module 'Microsoft.Xrm.Data.Powershell' -ErrorAction Ignore).version -eq '2.5'
if(-not $loadedandCorrectVersion)
{
find-module -Name Microsoft.Xrm.Data.Powershell -MinimumVersion 2.5 -MaximumVersion 2.5 | Install-Module -Scope CurrentUser -AllowClobber -Force
Import-Module -Name Microsoft.Xrm.Data.Powershell -MinimumVersion 2.5 -MaximumVersion 2.5 -Force -RequiredVersion 2.5
}

$xcrmConn = Get-CrmConnection -OrganizationName $OrganizationName -ServerUrl $ServerUrl  -Credential $AdminUserCredentials 

I added some "plumbing" to ensure that only version 2.5 of the module is downloaded and imported into the session where I'm going to run this. $xcrmConn is the connection we'll use for every subsequent call in this update of our user. According to the documentation you can specify this connection as a global variable; I chose not to do that so I could see what is going on in each call I make to this module.

The next task was to figure out how to get all the users. There are a bunch of different cmdlets available, from Get-CrmCurrentUserId to Get-MyCrmUserId, as you can see below:


ps:\crm> get-command get*crm*user*

CommandType Name Version Source
----------- ---- ------- ------
Alias Get-CrmCurrentUserId 2.5 Microsoft.Xrm.Data.Powershell
Function Get-CrmUserMailbox 2.5 Microsoft.Xrm.Data.Powershell
Function Get-CrmUserPrivileges 2.5 Microsoft.Xrm.Data.Powershell
Function Get-CrmUserSecurityRoles 2.5 Microsoft.Xrm.Data.Powershell
Function Get-CrmUserSettings 2.5 Microsoft.Xrm.Data.Powershell
Function Get-MyCrmUserId 2.5 Microsoft.Xrm.Data.Powershell

None of them really made sense for getting all users or a specific user. That is when I turned back to the samples and found this command:

Get-CrmRecords

What I discovered is that you have to understand how to use the filters. The first thing  I tried was to get all the users in CRM.


$users = Get-CrmRecords -EntityLogicalName systemuser -conn $xcrmconn -Fields systemuserid,fullname

After several runs (trial and error) I was able to get to a workable call to Get-CrmRecords for an individual user.

In order to add a user as an admin, we'll need to get the user's ID. Not only that, we'll also need the ID of the security role that we are going to add them to.


$sysAdminrole = 'System Administrator'
#user we are going to filter on must be in the syntax of last, first
$userObject = Get-CrmRecords -conn $xcrmconn -EntityLogicalName systemuser -FilterOperator eq -FilterAttribute domainname -FilterValue "$domainUsername" -Fields domainname,fullname

$userGuid = (($userObject | Where-Object {$_.keys -eq "crmRecords"}).values).systemuserid.guid
$userName = (($userObject | Where-Object {$_.keys -eq "crmRecords"}).values).fullname
if($userGuid)
{$userRoles = (Get-CrmUserSecurityRoles -conn $xcrmconn -UserId $userGuid).roleid.guid}
else
{
Throw "$DomainUsername not found in $ServerUrl and Organization $OrganizationName"
}

$adminObject = Get-CrmRecords -conn $xcrmconn -EntityLogicalName systemuser -FilterOperator eq -FilterAttribute domainname -FilterValue "$($AdminUserCredentials.username)" -Fields domainname,fullname
#get the admins guid and user name
$adminId = (($adminObject | Where-Object {$_.keys -eq "crmRecords"}).values).systemuserid.guid
$AdminUserName = (($adminObject | Where-Object {$_.keys -eq "crmRecords"}).values).fullname
$adminRoleObject = Get-CrmUserSecurityRoles -conn $xcrmConn -UserId $adminId | Where-Object {$_.rolename -eq $sysAdminrole}
$adminroles = ($adminRoleObject).roleid.guid
$adminRoleName = $adminroleobject.rolename

Now I have the required items for adding the role. All I need to do is make sure the role isn't already there, and then add the security role ID to the user. The result is a user with the System Administrator role added to it.
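
In isolation, that check-and-add step looks roughly like the sketch below (variable names are taken from the full script that follows; Add-CrmSecurityRoleToUser's -SecurityRoleId expects the GUID of the security role, which is $adminroles here):

if ($userRoles -contains $adminroles)
{
    Write-Output "$DomainUsername is already an admin"
}
else
{
    # stamp the System Administrator role's GUID onto the target user
    Add-CrmSecurityRoleToUser -conn $xcrmConn -UserId $userGuid -SecurityRoleId $adminroles
    Write-Output "$DomainUsername added to the $adminRoleName role"
}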

Full Script Follows:

#requires -module PowerShellGet
<#
.SYNOPSIS
A brief description of the updateusers.ps1 file.
.DESCRIPTION
A detailed description of the updateusers.ps1 file.
.PARAMETER ServerUrl
A description of the ServerUrl parameter.
.PARAMETER OrganizationName
Organization name in CRM, for example: yourorg
.PARAMETER DomainUsername
User name to add as an admin to CRM, for example: domain\domainuser
.PARAMETER AdminUserCredentials
Credentials that have admin privileges on the URL passed.
.EXAMPLE
PS C:\> .\updateusers.ps1 -DomainUsername 'domain\domainuser' -AdminUserCredentials (Get-Credential)
.NOTES
Additional information about the file.
#>
param
(
    [string]$ServerUrl = 'http://yourCrminstance.yourname.com:80',
    [string]$OrganizationName = 'YourInstance',
    [Parameter(Mandatory = $true)]
    [string]$DomainUsername = 'domain\domainuser',
    [pscredential]$AdminUserCredentials = (Get-Credential)
)
$loadedandCorrectVersion = (Get-Command -Module 'Microsoft.Xrm.Data.Powershell' -ErrorAction Ignore).Version -eq '2.5'
if(-not $loadedandCorrectVersion)
{
    Find-Module -Name Microsoft.Xrm.Data.Powershell -MinimumVersion 2.5 -MaximumVersion 2.5 | Install-Module -Scope CurrentUser -AllowClobber -Force
    Import-Module -Name Microsoft.Xrm.Data.Powershell -MinimumVersion 2.5 -MaximumVersion 2.5 -Force -RequiredVersion 2.5
}
if(Get-Command -Module 'Microsoft.Xrm.Data.Powershell')
{
    $xcrmConn = Get-CrmConnection -OrganizationName $OrganizationName -ServerUrl $ServerUrl -Credential $AdminUserCredentials -Verbose
    #https://github.com/seanmcne/Microsoft.Xrm.Data.PowerShell.Samples/blob/master/Microsoft.Xrm.Data.PowerShell.Samples/UpdateCrmUsersSettings/UpdateCrmUsersSettings.ps1

    #get the necessary objects for the admin
    $sysAdminrole = 'System Administrator'
    #user we are going to filter on must be in the syntax of last, first
    $userObject = Get-CrmRecords -conn $xcrmConn -EntityLogicalName systemuser -FilterOperator eq -FilterAttribute domainname -FilterValue "$domainUsername" -Fields domainname,fullname
    $userGuid = (($userObject | Where-Object {$_.keys -eq "crmRecords"}).values).systemuserid.guid
    $userName = (($userObject | Where-Object {$_.keys -eq "crmRecords"}).values).fullname
    if($userGuid)
    {
        $userRoles = (Get-CrmUserSecurityRoles -conn $xcrmConn -UserId $userGuid).roleid.guid
    }
    else
    {
        Throw "$DomainUsername not found in $ServerUrl and Organization $OrganizationName"
    }
    $adminObject = Get-CrmRecords -conn $xcrmConn -EntityLogicalName systemuser -FilterOperator eq -FilterAttribute domainname -FilterValue "$($AdminUserCredentials.username)" -Fields domainname,fullname
    #get the admin's guid and user name
    $adminId = (($adminObject | Where-Object {$_.keys -eq "crmRecords"}).values).systemuserid.guid
    $AdminUserName = (($adminObject | Where-Object {$_.keys -eq "crmRecords"}).values).fullname
    $adminRoleObject = Get-CrmUserSecurityRoles -conn $xcrmConn -UserId $adminId | Where-Object {$_.rolename -eq $sysAdminrole}
    $adminroles = ($adminRoleObject).roleid.guid
    $adminRoleName = $adminRoleObject.rolename
    if($adminRoleName -eq $sysAdminrole)
    {
        if($userroles -like $adminroles)
        {
            Write-Output "$DomainUsername is already an admin"
        }
        else
        {
            # add the System Administrator security role (by its role GUID) to the target user
            Add-CrmSecurityRoleToUser -conn $xcrmConn -UserId $userGuid -SecurityRoleId $adminroles
            Write-Output "$DomainUsername Added to AdminRole $adminRoleName"
        }
    }
    else
    {
        Write-Warning "The $($AdminUserCredentials.username) doesn't have the role of 'System Administrator'"
    }
}
else
{
    throw "cannot load the powershell module 'Microsoft.Xrm.Data.Powershell'"
}

I hope this helps someone

Until then keep Scripting

Thom

Backing up TFS Build and Release Definitions

We've chosen to use TFS 2015 Update 3 to build and release our code. As such, I have a great many build and release definitions configured in TFS that I often make changes to, so I'd like to keep a running set of configurations that I can restore from a previous copy. This post is about how I back up build and release definitions with PowerShell.

To begin with, I need to figure out how to call TFS with its API. Since all of these calls are made via REST, we can use PowerShell's Invoke-RestMethod cmdlet. The first thing I need to do is get all my projects so I can loop through them.

function Get-TFSProjects
{
 Param($tfsUrl,$apiversion = '3.0-preview')
 Invoke-RestMethod -method get -UseDefaultCredentials -uri "$tfsurl/_apis/projects?api-version=$apiversion"
}

So to call this all I need to do is call my function:

Get-TFSProjects -tfsurl "http://mytfsinstance.com:8080/tfs/defaultcollection"

Which then returns me an object with the projects:

count value 
----- ----- 
 18 {@{id=[guid]; name=Marketing; url=http://mytfsinstance.com:8080/tfs/defaultcollection/_apis/projects/[guid]; state=wellFormed; revision=3887163}, @{id=83ed4f89- 

The [guid] placeholders indicate GUIDs that are specific to my instance of TFS.

Now that I have the projects for my collection/instance of TFS, I can loop through them and get each build definition. All my function expects is the $tfsprojects object that I gathered from the first function:


function Backup-TFSbuildDefinitions
{
 param([object]$tfsprojs,$tfsurl = 'mytfsinstance.com', $tfscollection = 'Defaultcollection', $apiversion = '3.0-preview',[string]$Path = 'c:\temp\tfsbuilds')
 #"$Uri/$DefaultCollection/$TeamProject/_apis/build/definitions?api-version=2.0&name=$buildName"
 $tfsInstance = "http://$($tfsurl):8080/tfs/$tfsCollection"

foreach($tfsproj in $tfsprojs.value)
 { 
 $tfsProjName = $tfsproj.name
 $tfsdev = "$tfsInstance/$tfsProjName"
 $projectIds = Invoke-RestMethod -Method Get -UseDefaultCredentials -uri "$tfsdev/_apis/build/definitions?api-version=$apiversion" -ContentType application/json 
 foreach($projectid in $projectIds)
 {
 $relnumber = $projectid.value.id 
 foreach($rel in $relnumber)
 {
 $relDef = invoke-restmethod -method get -UseDefaultCredentials -uri "$tfsdev/_apis/build/definitions/$($rel)?api-version=$apiversion" -ContentType application/json
 $exportPath = "$path/$tfsProjName"
 if(-not (test-path $exportPath))
 {
 mkdir $exportPath
 }
 $jsonDoc = $reldef | convertto-json -Depth 100
 $jsonDoc | out-file -FilePath "$exportPath\$($reldef.name).json"
 }

 }
 }

}

Now all I need to do is look at the api reference and use the appropriate call to get the release configuration.


function Backup-TFSReleaseDefinitions
{
 param([object]$tfsprojs,$tfsurl = 'mytfsinstance.com', $tfscollection = 'Defaultcollection', $apiversion = '3.0-preview',[string]$Path = 'c:\temp\tfsprojects')
 $tfsInstance = "http://$($tfsurl):8080/tfs/$tfsCollection"

foreach($tfsproj in $tfsprojs.value)
 { 
 $tfsProjName = $tfsproj.name
 $tfsdev = "$tfsInstance/$tfsProjName"
 $projectIds = Invoke-RestMethod -Method Get -UseDefaultCredentials -uri "$tfsdev/_apis/release/definitions?api-version=$apiversion" -ContentType application/json 
 foreach($projectid in $projectIds)
 {
 $relnumber = $projectid.value.id 
 foreach($rel in $relnumber)
 {
 $relDef = invoke-restmethod -method get -UseDefaultCredentials -uri "$tfsdev/_apis/release/definitions/$($rel)?api-version=$apiversion" -ContentType application/json
 $exportPath = "$path/$tfsProjName"
 if(-not (test-path $exportPath))
 {
 mkdir $exportPath
 }
 $jsonDoc = $reldef | convertto-json -Depth 100
 $jsonDoc | out-file -FilePath "$exportPath\$($reldef.name).json"
 }

 }
 }

}

Now if I stitch it all together, I can call my TFS instance and save all the configurations to local disk from wherever the script is called.
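
For example, stitching the pieces together might look something like this (the server name and output paths are placeholders for your own values):

$tfsProjects = Get-TFSProjects -tfsUrl 'http://mytfsinstance.com:8080/tfs/Defaultcollection'
Backup-TFSbuildDefinitions   -tfsprojs $tfsProjects -tfsurl 'mytfsinstance.com' -Path 'c:\temp\tfsbuilds'
Backup-TFSReleaseDefinitions -tfsprojs $tfsProjects -tfsurl 'mytfsinstance.com' -Path 'c:\temp\tfsreleases'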

Here is the full script on my gist:

I hope this helps someone

 

Until then

Keep scripting

 

thom

Adding a Retention Tag / Custom Folder / Exchange

At the company I work for, we have begun the task of moving users to Exchange Online. As part of that, we discovered we needed to add a policy that sets the retention on a folder to some value specified by the Exchange Online administrator. This post is about how I was able to piece together some scripts from another post and come up with something that could apply this policy to any folder I found with a specific name. I am by no means an Exchange expert, so bear with me as I do my best to explain.

To start with, if we browse to my Exchange admin center and look at compliance management, then retention policies, I've set up a test retention policy because I want the contents of a folder to be held for a certain time period.

This is my retention tag, which I called TestRetention:

[Screenshot: retention tags in the Microsoft Exchange admin center]

Here I’ve associated my tag with my Policy:

[Screenshot: retention policies in the Microsoft Exchange admin center]

Here I’m showing that my user has the retention policy set that has my tag in it.

[Screenshot: mailbox properties showing the assigned retention policy]

Now on to the scripts I started from: Stamping Retention Policy Tag and Script to recreate "managed folders".

In the example they show you how to connect to an on-premises Exchange server. To connect to an Exchange Online instance, I just had to modify the code to this:

$ImpersonationCreds = Get-Credential -Message "Enter Credentials for Account with Impersonation Role..."
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri $connectionUri -Authentication Basic -Credential $ImpersonationCreds
Import-PSSession $Session

Where the connection URI points to Exchange Online: 'https://outlook.office365.com/powershell-liveid/'

This session brings in all the cmdlets I'll need, using what is called PowerShell implicit remoting. Now that I have the cmdlets for Exchange Online, I can work on the mailbox I need to make this change on.

$mailboxes = get-content $TargetMailboxes
   $Version = "Exchange2013_SP1"
    $returnStatus =@()
    Add-Type -Path $ApiPath
    $ExchangeVersion = [Microsoft.Exchange.WebServices.Data.ExchangeVersion]::$Version
    $Service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService($ExchangeVersion)
    $Creds = New-Object System.Net.NetworkCredential($ImpersonationCreds.UserName, $ImpersonationCreds.Password)
    $RetentionPeriod = New-Object Microsoft.Exchange.WebServices.Data.ExtendedPropertyDefinition(0x301A,[Microsoft.Exchange.WebServices.Data.MapiPropertyType]::Integer)
    $RetentionFlags = New-Object Microsoft.Exchange.WebServices.Data.ExtendedPropertyDefinition(0x301D,[Microsoft.Exchange.WebServices.Data.MapiPropertyType]::Integer)
    $PolicyTag = New-Object Microsoft.Exchange.WebServices.Data.ExtendedPropertyDefinition(0x3019,[Microsoft.Exchange.WebServices.Data.MapiPropertyType]::Binary) 

$ExchangeVersion sets the version of the Exchange schema I'm going to use. In order to get to the retention flags and policy tag, I need to declare extended property definitions for the items shown in the pictures above. Another good post on how to connect and on what we are looking to accomplish is posted here. Now on to the meat of the post: the function I wrote to search for folders in the target mailbox.

I chose to call this function Get-MailBoxfolders. The function expects an Exchange service object, a valid SMTP mailbox address, and a folder to find (Folder2Find).

The assumption is that whoever is running this script has the proper credentials to get to the mailbox. To be able to find objects in the mailbox, we must create a FolderView object and set a value for how many folders we wish to return. In addition, we need to tell the EWS DLL how far to traverse the mailbox; this is done by setting an enum value on the FolderView object. Now that we have told the DLL that we want a folder view returning up to 1000 folders with deep traversal, we need to tell it which folder to start from. This is done by creating a FolderId object for the folder root, again using an enum value; the value we've chosen is the MsgFolderRoot folder.

$fvFolderView = new-object Microsoft.Exchange.WebServices.Data.FolderView(1000)
  $fvFolderView.Traversal = [Microsoft.Exchange.WebServices.Data.FolderTraversal]::Deep
  $folderid = new-object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::MsgFolderRoot,$targetMailbox)

Now that we have the FolderId created, we need to bind to that folder so that we can call the search method on it. This is done by calling the Bind method on the [Microsoft.Exchange.WebServices.Data.Folder] class.

 $tfTargetFolder = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($service,$folderid) 

This binding allows us to call the method that finds the folders we are after. The FindFolders method expects the folder view object we defined earlier.

 $findFolderResults = $tfTargetFolder.FindFolders($fvFolderView) 

Now all we need to do is go through each of the folders in $findFolderResults. I chose to return the parent folder, the folder ID, and the folder class for folders of type 'IPF.Note'.

function Get-MailBoxfolders
{
  [CmdletBinding()]
  param
  (
    [Parameter(Mandatory=$true, Position=0, HelpMessage='A service that points to exchange instance you wish to query')]
    [Microsoft.Exchange.WebServices.Data.ExchangeService]$Service,
    [Parameter(Mandatory=$true, Position=1, HelpMessage='A mailbox (smtp) that the service has access to')]
    [string]$targetMailbox,
    [string]$Folder2Find
  )
  Write-Verbose -Message 'Create an object that gets the root folder for the mailbox'
  $fvFolderView = New-Object Microsoft.Exchange.WebServices.Data.FolderView(1000)
  $fvFolderView.Traversal = [Microsoft.Exchange.WebServices.Data.FolderTraversal]::Deep
  $folderid = New-Object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::MsgFolderRoot,$targetMailbox)

  $tfTargetFolder = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($Service,$folderid)

  $findFolderResults = $tfTargetFolder.FindFolders($fvFolderView)

  foreach($folder in $findFolderResults.Folders)
  {
    if($folder.FolderClass -eq 'IPF.Note')
    {
      $parentfolder = ($findFolderResults.Folders | Where-Object {$_.id.uniqueid -eq $folder.ParentFolderId.UniqueId}).displayname
      if(-not $parentfolder)
      {
        $parentfolder = 'Root'
      }
      if($Folder2Find)
      {
        if($folder.DisplayName -eq $Folder2Find)
        {
          [pscustomobject] @{
            'name'             = $folder.DisplayName
            'folderid'         = $folder.Id.UniqueId
            'ParentFolderName' = $parentfolder
            'ParentFolderId'   = $folder.ParentFolderId.UniqueId
            'folderclass'      = $folder.FolderClass
          }
        }
      }
      else
      {
        [pscustomobject] @{
          'name'             = $folder.DisplayName
          'folderid'         = $folder.Id.UniqueId
          'ParentFolderName' = $parentfolder
          'ParentFolderId'   = $folder.ParentFolderId.UniqueId
          'folderclass'      = $folder.FolderClass
        }
      }
    }
  }
}
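
The snippets above only find the folders; the actual stamping, which is what the $RetentionFlags, $RetentionPeriod and $PolicyTag definitions earlier are for, lives in the full Gist. As a rough, hedged sketch of what that step can look like with the EWS Managed API: here $foundFolder stands for one of the objects returned by Get-MailBoxfolders and $retentionTagGuid is the GUID of the retention tag you created (both placeholders), and the flag and period values are assumptions you'd take from the referenced stamping script.

$folderId   = New-Object Microsoft.Exchange.WebServices.Data.FolderId($foundFolder.folderid)
$folder2Tag = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($Service, $folderId)
$folder2Tag.SetExtendedProperty($RetentionFlags, 137)    # retention-flags value: an assumption, take it from the stamping script you follow
$folder2Tag.SetExtendedProperty($RetentionPeriod, 30)    # retention period in days, should match the tag's setting
$folder2Tag.SetExtendedProperty($PolicyTag, ([guid]$retentionTagGuid).ToByteArray())
$folder2Tag.Update()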

For the full script source see this Gist 

At the end of the script run this is how my folders look in Outlook based on my tagging.

[Screenshot: the tagged folders as they appear in Outlook]

I Hope this helps someone.

Until then

Keep Scripting

thom

WMF 5.1 now available

Richard Siddaway's Blog

The download for WMF 5.1 for down level operating systems is now available:

https://blogs.msdn.microsoft.com/powershell/2017/01/19/windows-management-framework-wmf-5-1-released/

WMF 5.1 can be installed on Windows 7 and 8.1 plus Windows Server 2008 R2, 2012, 2012 R2

Windows 10 and Server 2016 already have PowerShell 5.1 and don’t need this install.

if installing on Windows 7 or Server 2008 R2 the installation process has changed – READ THE RELEASE NOTES OR BE PREPARED FOR A LOT OF EXTRA EFFORT


Copying PowerShell object

Lately I've needed to take a PowerShell object and use it in several places in a JSON document that PowerShell nicely put into a custom object for me. What I needed was to be able to set each of the properties independently, with different values each time I added the object to the JSON document. To get this to work I tried several different approaches; this post is about how I solved the issue.

First we'll start with a custom object that comes from JSON:

$tasks2add = $tasks = $null
$taskjson = @'
[
 {
 "taskId": "1",
 "name": "Server-Scommaintenance",
 "enabled": false,
 "inputs": {
 "servers": "$(serverMonitors) ",
 "webMonitors": "$(webMonitors)",
 "MinuteValue": "2000",
 "maintValue": "inMaint"
 }
 },
 {
 "taskId": "2",
 "name": "Server-Scommaintenance",
 "enabled": false,
 "inputs": {
 "servers": "$(serverMonitors) ",
 "webMonitors": "$(webMonitors)",
 "emailusers": "$(ScomNotify)",
 "MinuteValue": "2000",
 "maintValue": "RemoveMaint"
 }
 }
]
'@
$tasks2add = $taskjson|convertfrom-json

Now if we look at my variable $tasks2add, we'll see that it has all the items from the custom JSON above:

$tasks2add = $taskjson|convertfrom-json 

PS PS:\> $tasks2add

taskId name                   enabled inputs                                                                                                                       
------ ----                   ------- ------                                                                                                                       
1      Server-Scommaintenance   False @{servers=$(serverMonitors) ; webMonitors=$(webMonitors); MinuteValue=2000; maintValue=inMaint}                              
2      Server-Scommaintenance   False @{servers=$(serverMonitors) ; webMonitors=$(webMonitors); emailusers=$(ScomNotify); MinuteValue=2000; maintValue=RemoveMaint}

Now I'll take that same set of objects, add them to an array list, and then set each one. Let's see what the output looks like:

$newArraylist = new-object System.Collections.Generic.List[system.object]
$newArraylist.Add((New-object pscustomobject ($tasks2add[0])))
$newArraylist.Add((New-object pscustomobject ($tasks2add[1])))
$newArraylist.Add((New-object pscustomobject ($tasks2add[0])))
$newArraylist.Add((New-object pscustomobject ($tasks2add[1])))
#$newArraylist.count

$newArraylist[0].enabled = $true
$newArraylist[1].enabled = $false
$newArraylist[2].enabled = $false
$newArraylist[3].enabled = $true
$newArraylist

Here is what my output looks like:

[Screenshot: the enabled values do not match what was assigned to each entry]

You would expect that the first and second tasks would be set to $true and $false respectively as I set them with the $newArraylist[x].enabled = $true / $false.

So what happened here? PowerShell points (references) each entry back at the object it was created from, so we aren't really getting a copy; we are getting another reference to the same object. After much gnashing of teeth and trying several different methods, I finally came to a solution that is described in this PowerShell Q&A post.
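
A quick way to see the reference behavior in isolation (my own illustration, not from the original post):

$original = [pscustomobject]@{ enabled = $false }
$alias = $original          # not a copy: just a second reference to the same object
$alias.enabled = $true
$original.enabled           # True - the 'copy' and the original are the same object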

To get this to work the way I wanted, where each copy of the object is settable independently, I had to use the PSObject property of my custom object and its Copy() method.

$tasks2add[1].PSObject.copy()

This makes the code much shorter and solves my issue: I can now set each of my custom objects the way I'd like.

$newArraylist = new-object System.Collections.Generic.List[system.object]
$newArraylist.Add($tasks2add[0].PSObject.Copy())
$newArraylist.Add($tasks2add[1].PSObject.Copy())
$newArraylist.Add($tasks2add[0].PSObject.Copy())
$newArraylist.Add($tasks2add[1].PSObject.Copy())
#$newArraylist.count

$newArraylist[0].enabled = $true
$newArraylist[1].enabled = $false
$newArraylist[2].enabled = $false
$newArraylist[3].enabled = $true
$newArraylist

Now when I look at my objects, they are in the condition I want: I can set each item I add to my array list independently.

[Screenshot: each entry now shows the enabled value it was assigned]

I Hope this helps someone.

Until then

Keep Scripting

thom