Adding a Retention Tag / Custom Folder / Exchange

At the company I work for, we have begun moving users to Exchange Online. As part of that effort, we discovered we needed to apply a retention policy tag to folders, with the retention value specified by the Exchange Online administrator. This post is about how I pieced together some scripts from the posts referenced below and came up with something that could apply this policy to any folder I found with a specific name. I am by no means an Exchange expert, so bear with me as I do my best to explain.

To start, if we browse to the Exchange admin center and look at compliance management, then retention policies, I've set a test retention policy because I want the contents of a folder to be held for some time period X.

This is my retention tag, which I named TestRetention:

[Screenshot: retention tags - Microsoft Exchange]

Here I’ve associated my tag with my Policy:

[Screenshot: retention policies - Microsoft Exchange]

Here I’m showing that my user has the retention policy set that has my tag in it.

[Screenshot: mailboxes - Microsoft Exchange]

Now on to the scripts I started from: Stamping Retention Policy Tag and Script to recreate "managed folders".

Those examples show you how to connect to an on-premises Exchange server. To connect to an Exchange Online instance, I just had to modify the connection code to this:

$connectionUri = 'https://outlook.office365.com/powershell-liveid/'   # Exchange Online endpoint
$ImpersonationCreds = Get-Credential -Message "Enter Credentials for Account with Impersonation Role..."
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri $connectionUri -Authentication Basic -Credential $ImpersonationCreds
Import-PSSession $Session

The connection URI points at Exchange Online rather than at an on-premises server.

Importing this session brings in all the cmdlets I'll need, via what is called PowerShell implicit remoting. Now that I have the Exchange Online cmdlets, I can work on the mailbox where I need to make this change.

$mailboxes = Get-Content $TargetMailboxes   # $TargetMailboxes: a text file of mailbox SMTP addresses
$Version = "Exchange2013_SP1"
$returnStatus = @()
Add-Type -Path $ApiPath                     # $ApiPath: path to the EWS Managed API (Microsoft.Exchange.WebServices.dll)
$ExchangeVersion = [Microsoft.Exchange.WebServices.Data.ExchangeVersion]::$Version
$Service = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService($ExchangeVersion)
$Creds = New-Object System.Net.NetworkCredential($ImpersonationCreds.UserName, $ImpersonationCreds.Password)
$RetentionPeriod = New-Object Microsoft.Exchange.WebServices.Data.ExtendedPropertyDefinition(0x301A,[Microsoft.Exchange.WebServices.Data.MapiPropertyType]::Integer)
$RetentionFlags = New-Object Microsoft.Exchange.WebServices.Data.ExtendedPropertyDefinition(0x301D,[Microsoft.Exchange.WebServices.Data.MapiPropertyType]::Integer)
$PolicyTag = New-Object Microsoft.Exchange.WebServices.Data.ExtendedPropertyDefinition(0x3019,[Microsoft.Exchange.WebServices.Data.MapiPropertyType]::Binary)

The $ExchangeVersion sets the version I'm going to use. To get at the retention flags, period, and policy tag shown in the pictures above, I need to declare ExtendedPropertyDefinition objects for the underlying MAPI properties. Another good post on what we're connecting to and trying to accomplish is posted here.
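The full stamping logic is in the Gist linked at the end, but as a hedged sketch of how these definitions get used (Get-RetentionPolicyTag comes from the Exchange Online session imported earlier; $folderId is the ID of a folder found by the search function below; 137 and 365 are example values):

# Hedged sketch: stamp a folder with the TestRetention tag.
$tagGuid = [Guid](Get-RetentionPolicyTag 'TestRetention').RetentionId
$folder = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($Service, $folderId)
$folder.SetExtendedProperty($RetentionFlags, 137)    # explicit personal tag
$folder.SetExtendedProperty($RetentionPeriod, 365)   # retention period in days
$folder.SetExtendedProperty($PolicyTag, $tagGuid.ToByteArray())
$folder.Update()

Now onto the meat of the post: the function I wrote to search for folders in the target mailbox.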

I chose to call this function Get-MailBoxfolders. The function expects an Exchange service object, a valid SMTP mailbox address, and a Folder2Find.

The assumption is that whoever runs this script has the proper credentials to get to this mailbox. To find objects in the mailbox, we must create a FolderView object, which sets how many folders the search may return (here up to 1000). We also need to tell the EWS DLL how far to traverse the mailbox; this is done by setting an enum value on the FolderView's Traversal property, in our case Deep. Finally we need to tell the DLL which folder to start with. This is done by creating a FolderId object from a well-known folder name, again an enum value; the one we've chosen is MsgFolderRoot.

$fvFolderView = New-Object Microsoft.Exchange.WebServices.Data.FolderView(1000)
$fvFolderView.Traversal = [Microsoft.Exchange.WebServices.Data.FolderTraversal]::Deep
$folderid = New-Object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::MsgFolderRoot,$targetMailbox)

Now that we have those objects created, we need to bind to the starting folder so that we can call the search method on it. This is done by calling the static Bind method on the [Microsoft.Exchange.WebServices.Data.Folder] class.

 $tfTargetFolder = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($service,$folderid) 

This binding allows us to call FindFolders to locate the folders we're after. The FindFolders method expects the folder view we defined earlier.

 $findFolderResults = $tfTargetFolder.FindFolders($fvFolderView) 

Now all we need to do is go through each of the folders in $findFolderResults. I chose to return the parent folder name, the folder ID, and only folders of class 'IPF.Note'.

function Get-MailBoxfolders
{
  [CmdletBinding()]
  param
  (
    [Parameter(Mandatory=$true, Position=0, HelpMessage='A service that points to exchange instance you wish to query')]
    [Microsoft.Exchange.WebServices.Data.ExchangeService]$Service,
    [Parameter(Mandatory=$true, Position=1, HelpMessage='A mailbox (smtp) that the service has access to')]
    [string]$targetMailbox,
    [string]$Folder2Find
  )
  Write-Verbose -Message "create an object that gets the root folder for the mailbox"
  $fvFolderView = New-Object Microsoft.Exchange.WebServices.Data.FolderView(1000)
  $fvFolderView.Traversal = [Microsoft.Exchange.WebServices.Data.FolderTraversal]::Deep
  $folderid = New-Object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::MsgFolderRoot,$targetMailbox)

  $tfTargetFolder = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($Service,$folderid)

  $findFolderResults = $tfTargetFolder.FindFolders($fvFolderView)

  foreach($folder in $findFolderResults.Folders)
  {
    if($folder.FolderClass -eq 'IPF.Note')
    {
      # Resolve the parent folder's display name; folders directly under the root have no match.
      $parentfolder = ($findFolderResults.Folders | Where-Object {$_.Id.UniqueId -eq $folder.ParentFolderId.UniqueId}).DisplayName
      if(-not $parentfolder)
      {
        $parentfolder = 'Root'
      }
      if($Folder2Find)
      {
        if($folder.DisplayName -eq $Folder2Find)
        {
          [pscustomobject] @{
            'name'             = $folder.DisplayName
            'folderid'         = $folder.Id.UniqueId
            'ParentFolderName' = $parentfolder
            'ParentFolderId'   = $folder.ParentFolderId.UniqueId
            'folderclass'      = $folder.FolderClass
          }
        }
      }
      else
      {
        [pscustomobject] @{
          'name'             = $folder.DisplayName
          'folderid'         = $folder.Id.UniqueId
          'ParentFolderName' = $parentfolder
          'ParentFolderId'   = $folder.ParentFolderId.UniqueId
          'folderclass'      = $folder.FolderClass
        }
      }
    }
  }
}
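To tie the pieces together, here is a hedged usage sketch; the EWS URL is the standard Exchange Online endpoint, and the mailbox address is a placeholder:

# Point the EWS service at Exchange Online with the impersonation
# credentials gathered earlier, then search the mailbox for the folder.
$Service.Credentials = $Creds
$Service.Url = 'https://outlook.office365.com/EWS/Exchange.asmx'
Get-MailBoxfolders -Service $Service -targetMailbox 'user@contoso.com' -Folder2Find 'TestRetention'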

For the full script source, see this Gist.

At the end of the script run, this is how my folders look in Outlook based on my tagging:

[Screenshot: Unread Mail - Outlook]

I hope this helps someone.

Until then

Keep Scripting

thom


WMF 5.1 now available

Richard Siddaway's Blog

The download for WMF 5.1 for down level operating systems is now available:

https://blogs.msdn.microsoft.com/powershell/2017/01/19/windows-management-framework-wmf-5-1-released/

WMF 5.1 can be installed on Windows 7 and 8.1, plus Windows Server 2008 R2, 2012, and 2012 R2.

Windows 10 and Server 2016 already have PowerShell 5.1 and don’t need this install.

If installing on Windows 7 or Server 2008 R2, the installation process has changed – READ THE RELEASE NOTES OR BE PREPARED FOR A LOT OF EXTRA EFFORT.


Copying PowerShell object

Lately I’ve needed to take a PowerShell object and use it in several places in a JSON document that PowerShell nicely put in a custom object for me.  What I needed this object to do was to allow for a set of each one of the properties and they needed to be different for each time I added it to the JSON object.   To get this to work I tried several different means. This post is about how I  worked to solve this issue.

First we’ll start with a customobject that comes from JSON

$tasks2add = $tasks = $null
$taskjson = @'
[
 {
 "taskId": "1",
 "name": "Server-Scommaintenance",
 "enabled": false,
 "inputs": {
 "servers": "$(serverMonitors) ",
 "webMonitors": "$(webMonitors)",
 "MinuteValue": "2000",
 "maintValue": "inMaint"
 }
 },
 {
 "taskId": "2",
 "name": "Server-Scommaintenance",
 "enabled": false,
 "inputs": {
 "servers": "$(serverMonitors) ",
 "webMonitors": "$(webMonitors)",
 "emailusers": "$(ScomNotify)",
 "MinuteValue": "2000",
 "maintValue": "RemoveMaint"
 }
 }
]
'@
$tasks2add = $taskjson|convertfrom-json

Now if I look at my variable $tasks2add, we'll see that it has all the items from the JSON above:

PS PS:\> $tasks2add

taskId name                   enabled inputs                                                                                                                       
------ ----                   ------- ------                                                                                                                       
1      Server-Scommaintenance   False @{servers=$(serverMonitors) ; webMonitors=$(webMonitors); MinuteValue=2000; maintValue=inMaint}                              
2      Server-Scommaintenance   False @{servers=$(serverMonitors) ; webMonitors=$(webMonitors); emailusers=$(ScomNotify); MinuteValue=2000; maintValue=RemoveMaint}

Now I'll take that same set of objects, add each one to a list, and then set the enabled property on each. Let's see what the output looks like:

# Naive approach: add the same objects to a list, then try to set each copy independently.
$newArraylist = New-Object System.Collections.Generic.List[System.Object]
$newArraylist.Add((New-Object pscustomobject ($tasks2add[0])))
$newArraylist.Add((New-Object pscustomobject ($tasks2add[1])))
$newArraylist.Add((New-Object pscustomobject ($tasks2add[0])))
$newArraylist.Add((New-Object pscustomobject ($tasks2add[1])))
#$newArraylist.count

$newArraylist[0].enabled = $true
$newArraylist[1].enabled = $false
$newArraylist[2].enabled = $false
$newArraylist[3].enabled = $true
$newArraylist

Here is what my output looks like:

[Screenshot: $newArraylist output]

You would expect the first and second tasks to be set to $true and $false respectively, since I set them with $newArraylist[x].enabled = $true / $false, but the output shows otherwise.

So what happened here? PowerShell takes the array object and points (references) the values in the list entry back at the first created object. So we aren't really getting a copy; items 0 and 2 (and 1 and 3) are the same underlying object, and the later assignment wins. After much gnashing of teeth and trying several different methods, I finally came to a solution described in this PowerShell Q&A post.

To get this to work the way I wanted, with each copy of the new object settable independently, I had to use the psobject property of my custom object and its Copy method:

$tasks2add[1].PSObject.copy()

This makes the code much shorter and solves my issue: I can now set my custom objects the way I'd like them to be.

# PSObject.Copy() returns an independent copy rather than another reference.
$newArraylist = New-Object System.Collections.Generic.List[System.Object]
$newArraylist.Add($tasks2add[0].PSObject.Copy())
$newArraylist.Add($tasks2add[1].PSObject.Copy())
$newArraylist.Add($tasks2add[0].PSObject.Copy())
$newArraylist.Add($tasks2add[1].PSObject.Copy())
#$newArraylist.count

$newArraylist[0].enabled = $true
$newArraylist[1].enabled = $false
$newArraylist[2].enabled = $false
$newArraylist[3].enabled = $true
$newArraylist

Now if I look at my object, it is in the condition I want: each item I added to my array list can be set independently.

[Screenshot: $newArraylist output after PSObject.Copy()]
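One caveat worth noting: PSObject.Copy() is a shallow copy, so nested objects (like the inputs property in this JSON) are still shared between the copies. If the nested values also need to be independent, a common workaround, sketched here rather than taken from the original solution, is a JSON round-trip:

# Deep copy via a JSON round-trip; -Depth makes sure nested properties survive.
$deepCopy = $tasks2add[0] | ConvertTo-Json -Depth 10 | ConvertFrom-Json
$deepCopy.inputs.maintValue = 'changed'   # does not affect $tasks2add[0]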

I hope this helps someone.

Until then

Keep Scripting

thom

Deploying a SharePoint App to SharePoint Online

This article is about how I was able to use the SharePoint modules to successfully deploy an application to SharePoint Online.

I've used some scripting before to update items in SharePoint. This blog article is how I took the build that IOZ tools creates and deployed it to SharePoint Online.

First I needed to download the latest copy of SharePointPnPPowerShellOnline.

I discovered while using this module that it can add a PSDrive to my session. This means I should be able to upload files to SharePoint as if it were a drive on my local machine. Here is how you connect to SharePoint Online:

Install-Module -Name SharePointPnPPowerShellOnline
$url = 'https://yourtenant.sharepoint.com/sites/apps'   # your SharePoint Online site (example value)
$adminPassword = 'password'
$adminUserName = 'mysharepoint@onmicrosoft.com'
$creds = $adminPassword | ConvertTo-SecureString -AsPlainText -Force
$SPdevcredentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $adminUserName, $creds
Connect-PnPOnline -Url $url -Credentials $SPdevcredentials -CreateDrive

Now that I have a connection to the SharePoint Online instance, I can see the new PowerShell drive with the Get-PSDrive cmdlet:

get-psdrive

[Screenshot: Get-PSDrive output showing the SPO: drive]

As you can see, I have a new drive configured for use in my session; listing the SPO: drive returns the contents of the SharePoint site:

[Screenshot: directory listing of the SPO: drive]

My applications are in the AppCatalog folder; to get to that folder, all I need to do is cd to that directory. To upload my app, all I need to do is add it with Add-PnPFile.

[Screenshot: uploading the app with Add-PnPFile]

I had the best success when I used Get-Item (gi) and passed the FullName property of the file I was sending to SharePoint. One other gotcha: the folder to upload to is a subfolder of the site you are connected to. In my case I was connected to /sites/apps, so specifying appcatalog was all I needed.
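Putting those pieces together, a minimal upload sketch might look like this (the local path and file name are hypothetical):

# Resolve the package locally, then upload it to the appcatalog folder
# of the site the PnP session is connected to.
$app = Get-Item -Path 'C:\builds\MyApp.app'
Add-PnPFile -Path $app.FullName -Folder 'appcatalog'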

Now all I needed to do was put this in a script I could call from my CI automation and add some error logic. The result is a full-fledged script called deployspapp.ps1. The full source is found on my Gist:

I hope this helped someone

Until then keep scripting

Thom

Checking for valid Url using TryCreate

I’ve been working with Sharepoint online and needed to validate that a user is passing me a valid url.  So after doing some searching i found that [system.uri] had some methods that were useful.  In particular there is a method called TryCreate. This will take a url that you send to the method and let you know if it is a valid uri or not.

So first let's start with a URL that we know is valid: Bing.com.

TryCreate expects three values to be passed to it.

The first is a string or URI, as you can see here.

The second is a System.UriKind. This is an enumeration with three possible values: Absolute, Relative, and RelativeOrAbsolute.

The third is a [ref]erence. This means I must declare a variable for the return result to be put into after the method evaluates what was passed to it.

[Screenshot: TryCreate method signature]

Here is what that looks like in practice. Note I must use [ref] for my return result, as it is used as a reference to get the results into:

$url = 'http:\\bing.com'
$kind = 'RelativeOrAbsolute'
$return = $null

[system.uri]::TryCreate($url,$kind,[ref]$return)

Now if I run this I’ll get the following output:

True

Now if I look at my return variable, we'll notice that it has a full object with values:

 PS Z:\> $return


AbsolutePath : /
AbsoluteUri : http://bing.com/
LocalPath : /
Authority : bing.com
HostNameType : Dns
IsDefaultPort : True
IsFile : False
IsLoopback : False
PathAndQuery : /
Segments : {/}
IsUnc : False
Host : bing.com
Port : 80
Query : 
Fragment : 
Scheme : http
OriginalString : http:\\bing.com
DnsSafeHost : bing.com
IdnHost : bing.com
IsAbsoluteUri : True
UserEscaped : False
UserInfo :

Now I can use my return value and test to see if the value returned is of type HTTP or HTTPS.

if($return -like 'http*')
{
    write-output 'This is a http or https address'
}

This is a http or https address
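Since $return is a fully populated System.Uri, a slightly more precise variation (my own tweak, not part of the original snippet) is to test the Scheme property directly:

# Compare against the parsed scheme instead of the raw string.
if ($return.Scheme -in 'http','https')
{
    write-output 'This is a http or https address'
}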

You could write several other test’s to figure out what Scheme of uri the user passed. I’ll leave that up to your scripting.  Here is a link to the rest of the uri Schemes

 

 

I hope this helped someone

Until then keep scripting

Thom

Using PowerShell Class to Deploy Zip files

Recently I have been working with @developermj on a class he wrote for deploying code to a server from a zip file. This blog article is about how that code works.

To start off we need access to the .NET classes that contain the zipping and unzipping features:

System.IO.Compression & System.IO.Compression.FileSystem

These get pulled in with two using namespace statements and Add-Type:

#requires -version 5.0
using namespace System.IO
using namespace System.IO.Compression
param(
 [Parameter(Mandatory=$true)][string]$sourceZip, 
 [Parameter(Mandatory=$true)][string]$destPath
)

add-type -assemblyname 'System.IO.Compression'
add-type -assemblyname 'System.IO.Compression.FileSystem'

Then we’ll build the first part of our utility which is our function to deploy the files. This function is where all the magic is:

function Deploy-Files {
    param(
        [ValidateNotNullOrEmpty()][FileInfo]$sourceZip,
        [ValidateNotNullOrEmpty()][DirectoryInfo]$destFolder
    )
    if (-not $sourceZip.Exists) {
        throw "Zip $($sourceZip.Name) does not exist"
    }
    [ZipArchive]$archive = [ZipFile]::Open($sourceZip, "Read")
    [DeployFile[]]$files = $archive.Entries |
        Where-Object { $_.Length -gt 0 } |
        ForEach-Object { [ArchiveFile]::new($_) }
    if ($files.Length -eq 0) {
        Write-Information "No files to copy"
    }
    $hasWritten = $false
    foreach ($file in $files) {
        [FileInfo]$destFile = "$destFolder$($file.GetName())"
        $copied = $file.TryCopy($destFile)
        if ($copied) { $hasWritten = $true }
    }
    Write-Information "Done"
    if (-not $hasWritten) {
        Write-Information "...Nothing copied"
    }
}

Since the incoming object is of type [FileInfo], we can find out whether the file exists with this statement: if (-not $sourceZip.Exists). If the source zip exists, we progress on through our function; otherwise we throw an exception.

Since we’ve imported the dot net classes for filecompression we now have an available type we can cast our $archive variable to [ZipArchive]. Since ZipArchive requires a stream we can open the zip file with the ZipFile class and stream it to the ZipArchive object.

Now that we have the entire contents of the archive in the $archive variable, we can apply our class to it. Below is what the value of my $archive looks like:

[DBG]: PS ps:\>> $archive

Entries Mode
------- ----
{Code/, Code/Lib/, Code/Lib/ICSharpCode.SharpZipLib.dll, Code/Mindscape.Samples.Powershell.ZipProvider.csproj...} Read

[DBG]: PS ps:\>> $archive.entries.count
11

The next line in the code is where we start using the class defined in our script:

[DeployFile[]]$files = $archive.Entries |
    Where-Object { $_.Length -gt 0 } |
    ForEach-Object { [ArchiveFile]::new($_) }

Since we are creating an object of type [DeployFile[]], PowerShell sees this and instantiates new objects from our class. In the example above we take each archive entry and create a new [ArchiveFile] from it; following the code through the loop, every entry whose length is greater than 0 gets wrapped as an [ArchiveFile].

class ArchiveFile : DeployFile {
    hidden [ZipArchiveEntry]$entry

    ArchiveFile([ZipArchiveEntry]$entry) {
        $this.entry = $entry
    }

    [DateTime] GetModifiedDate() {
        return $this.entry.LastWriteTime.UtcDateTime
    }

    [void] Copy([FileInfo]$file) {
        [ZipFileExtensions]::ExtractToFile($this.entry, $file.FullName, $true)
    }

    [string] GetName() {
        return "\$($this.entry.FullName)"
    }
}
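As a quick hedged illustration (assuming the $archive variable from the earlier [ZipFile]::Open call), you can wrap a single entry and exercise the class directly:

# Wrap the first non-empty entry and inspect it through the class.
$entry = $archive.Entries | Where-Object { $_.Length -gt 0 } | Select-Object -First 1
$deployFile = [ArchiveFile]::new($entry)
$deployFile.GetName()           # e.g. \Code/Lib/ICSharpCode.SharpZipLib.dll
$deployFile.GetModifiedDate()   # the entry's last write time in UTC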

As you can see from the declaration, the [ArchiveFile] class inherits from the [DeployFile] class. PowerShell will hit the constructor that matches what was passed to the class; here we passed a [ZipArchiveEntry].

Since the archive entry is now wrapped in one of these objects, it carries all the methods declared in the class for this type:

GetModifiedDate, Copy, GetName

It also inherits the following methods from [DeployFile]:

ShouldCopy, Copy, TryCopy, ToString

If we continue to loop through each item in our initial $archive variable, we end up with $files, a variable of type [DeployFile[]]. If we pipe an element to Get-Member, we see the class name [ArchiveFile] along with the methods inherited from the [DeployFile] class:

[DBG]: PS ps:\>> $files[0] | gm

 TypeName: ArchiveFile

Name MemberType Definition 
---- ---------- ---------- 
Copy Method void Copy(System.IO.FileInfo file) 
Equals Method bool Equals(System.Object obj) 
GetHashCode Method int GetHashCode() 
GetModifiedDate Method datetime GetModifiedDate() 
GetName Method string GetName() 
GetType Method type GetType() 
ShouldCopy Method bool ShouldCopy(System.IO.FileInfo file)
ToString Method string ToString() 
TryCopy Method bool TryCopy(System.IO.FileInfo file)

Now that we have our class, we can move on to deploying these files to the intended target, which is what this next block of code does:

foreach ($file in $files) {
    [FileInfo]$destFile = "$destFolder$($file.GetName())"
    $copied = $file.TryCopy($destFile)
    if ($copied) { $hasWritten = $true }
}

The foreach loops through each file and builds the destination location by appending the name that the class's GetName() method returns:

 [DBG]: PS ps:\>> $file
\Code/Lib/ICSharpCode.SharpZipLib.dll

[DBG]: PS ps:\>> $file.getname()
\Code/Lib/ICSharpCode.SharpZipLib.dll

Now that we have a [FileInfo] object, we can call the TryCopy method on our $file. TryCopy expects a [FileInfo]:

$copied = $file.TryCopy($destFile)

That takes us into the TryCopy method of the file's class:

[bool] TryCopy([FileInfo]$file) {
    if ($this.ShouldCopy($file)) {
        [DeployFile]::CreateFolderIfNeeded($file)
        Write-Verbose "Copying to $($file.Name)"
        $this.Copy($file)
        return $true
    }
    return $false
}

The first thing we do is test whether we should copy this file, using the ShouldCopy method on the same object ($this):

[bool] ShouldCopy([FileInfo]$file) {
    if (-not $file.Exists) {
        return $true
    }

    if ($this.GetModifiedDate() -gt $file.LastWriteTimeUtc) {
        return $true
    }

    return $false
}

This method first checks whether the file doesn't exist yet, with -not $file.Exists. Then it compares modification dates: if the archive entry's modified date is greater than the destination file's LastWriteTimeUtc, it returns true, meaning the file in the zip is newer and should be copied (hence the name ShouldCopy). If both tests fail, it returns false, because the file exists and is at least as new as the archive entry.

Now we return to TryCopy. Provided ShouldCopy returned true, we next check whether we need to create a directory, via a call to the static [DeployFile]::CreateFolderIfNeeded([FileInfo]) method. This method is part of the DeployFile class and creates a folder if one isn't present for the file in question.
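The post doesn't show the body of CreateFolderIfNeeded; a minimal sketch of what such a static method might look like (the real implementation is in the Gist) is:

# Hedged sketch: ensure the destination file's parent directory exists.
static [void] CreateFolderIfNeeded([FileInfo]$file) {
    if (-not $file.Directory.Exists) {
        $file.Directory.Create()
    }
}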

Now that the folder is created, we can call the Copy method on the $file object, which extracts the archive entry to the destination file name.

Note:

I haven’t been able to get this script to run on it’s own without writing a wrapper script to then call this one.  I’ve posted an article about this on Powershell.org.

https://powershell.org/forums/topic/system-io-compression-in-powershell-class/

Here is what I have in my wrapper script:

#requires -version 5.0
using namespace System.IO
using namespace System.IO.Compression
param(
 [Parameter(Mandatory=$true)][string]$sourceZip, 
 [Parameter(Mandatory=$true)][string]$destPath
)
add-type -assemblyname 'System.IO.Compression'
add-type -assemblyname 'System.IO.Compression.FileSystem'
& .\copy-code.ps1 -sourceZip $sourceZip -destpath $destpath

I hope this helps someone.

Until then keep scripting

Thom

Full copy of the script is in this Gist:

Updating Azure Alert Email

We have a number of Email’s setup for alerting that need to be changed. Rather than go to each alert and update their properties I chose to update each available alert in my subscriptions using PowerShell.  This post is about how I did that.

I will assume for the purposes of this post that you are already aware of how to connect to Azure. If you aren't familiar with that process, see the article posted here.

The first thing I needed to figure out was how to get my already-configured alerts. I chose to use the cmdlet Get-AzureRmResource, then filtered the results to find all the alerts in the current subscription context:

$alerts = Get-AzureRmResource |
    Where-Object { $_.ResourceType -like '*alert*' }

Now that I have all the resources that look like alerts, I can iterate through them and pull the properties of each alert with Get-AzureRmAlertRule:

foreach($alert in $alerts)
{
    Get-AzureRmAlertRule -ResourceGroup $alert.ResourceGroupName -Name $alert.Name
}

This returns output like:

Properties : Microsoft.Azure.Management.Insights.Models.Rule
Tags : {[$type,
 Microsoft.WindowsAzure.Management.Common.Storage.CasePreservedDictionary,
 Microsoft.WindowsAzure.Management.Common.Storage], [hidden-link:/subscripti
 ons/xxx/resourceGroups/AzureTesting/provid
 ers/Microsoft.Web/serverfarms/EasyAuth, Resource]}
Id : /subscriptions/xxxx/resourceGroups/AzureTes
 ting/providers/microsoft.insights/alertrules/LongHttpQueue
Location : East US
Name : LongHttpQueue

After some testing of this particular cmdlet, I discovered that the extra switch -DetailedOutput provides the detail I was looking for:

foreach($alert in $alerts)
{
    Get-AzureRmAlertRule -ResourceGroup $alert.ResourceGroupName -Name $alert.Name -DetailedOutput
}
Properties :
 Name: : LongHttpQueue EasyAuth
 Condition :
 DataSource :
 MetricName : HttpQueueLength
 ResourceId : /subscriptions/xxxxxxxx-xxxxxx-xxxxx-xxxxx-xxxxxxxxxx/re
 sourceGroups/AzureTesting/providers/Microsoft.Web/serverfarms/EasyAuth
 Operator : GreaterThan
 Threshold : 100
 Aggregation operator: Total
 Window size : 00:05:00
 Description : The HTTP queue for the instances of EasyAuth has a
 large number of pending requests.
 Status : Disabled
 Actions :
 SendToServiceOwners : True
 E-mails : 

Tags :
 $type :
 Microsoft.WindowsAzure.Management.Common.Storage.CasePreservedDictionary,
 Microsoft.WindowsAzure.Management.Common.Storage
 hidden-link:/subscriptions/xxxxxxxx-xxxxxx-xxxxx-xxxxx-xxxxxxxxxx/resourceGro
 ups/AzureTesting/providers/Microsoft.Web/serverfarms/EasyAuth:
 Resource
Id : /subscriptions/xxxxxxxx-xxxxxx-xxxxx-xxxxx-xxxxxxxxxx/resourceGroups/AzureTes
 ting/providers/microsoft.insights/alertrules/LongHttpQueue EasyAuth
Location : East US
Name : LongHttpQueue EasyAuth

Now I needed to find the email property on the object returned by Get-AzureRmAlertRule. Inspecting the object a little closer, I found a sub-object called Properties, and under it another object that my emails are associated with. What I discovered through trial and error was that the Actions property is an array of settings. The first item, if set, holds the customEmails and whether an email should be sent to service owners upon alert activation (shown below):

PS PS:\azure> $t = get-azureRmalertRule -Resourcegroup `
'Azure Testing' -Name 'LongHttpQueue EasyAuth'
PS PS:\azure> $t.properties.Actions[0]

CustomEmails SendToServiceOwners
------------ -------------------
{} True

So this means if there are no emails set, the array count is zero. The other item that may be present in the Actions object is a webhook, which you can see via the ServiceUri property of the actions object, as shown below:

PS PS:\azure> $t =(get-azurermalertrule -name 'CPUHigh Dev' `
 -resourcegroup Dev -DetailedOutput)

PS PS:\azure> $t.properties.Actions | fl

Properties : {[$type, Microsoft.WindowsAzure.Management.Common.Storage.CasePreservedDict
 ionary`1[[System.String, mscorlib]],
 Microsoft.WindowsAzure.Management.Common.Storage]}
ServiceUri : https://s1events.azure-automation.net/webhooks?token=xxxx

CustomEmails : {email@email.com, email2@email.com}
SendToServiceOwners : True

On to changing the email. According to the blog article from Microsoft, you can only delete or add alert rules. I found this to be only partially true, in that if an alert already exists, I can update it by just calling Add-AzureRmMetricAlertRule again.

Now, to add email items with Add-AzureRmMetricAlertRule, you can do it two different ways.

The first way is to use the cmdlet Microsoft provides, which creates an object of precisely the type, and in the format, that Add-AzureRmMetricAlertRule expects:

$email = 'youremail@youremailServer.com'
$newEmailObj = New-AzureRmAlertRuleEmail -CustomEmails $email
Add-AzureRmMetricAlertRule -Name $Name `
    -Location $Location -ResourceGroup $ResourceGroupName `
    -Operator ($alert.Operator) -Threshold ($alert.Threshold) `
    -TargetResourceId $alert.DataSource.ResourceUri `
    -MetricName $alert.DataSource.MetricName `
    -WindowSize $alert.WindowSize `
    -TimeAggregationOperator $alert.TimeAggregation `
    -Description $targetResourceId.properties.Description `
    -Actions $newEmailObj

The other way, when you already have the alert in an object from an earlier query, is to use the .Add method on its actions collection:

$email = 'youremail@youremailServer.com'
$targetResourceId = (Get-AzureRmAlertRule -ResourceGroup $ResourceGroupName -Name $Name -DetailedOutput)
$actions = $targetResourceId.properties.Actions
if($actions.Count -eq 0)
{
    $targetResourceId.properties.Actions.Add((New-AzureRmAlertRuleEmail -CustomEmails $email))
    $targetResourceId.properties.Actions[($targetResourceId.properties.Actions.Count - 1)].SendToServiceOwners = $true
    $addedEmail = $true
}
else
{
    $emailActions = $targetResourceId.properties.Actions.Count - 1
    $emails = $actions[$emailActions].CustomEmails
    if($emails -notcontains $email)
    {
        $targetResourceId.properties.Actions[$emailActions].CustomEmails.Add($email)
        $addedEmail = $true
    }
}

I chose to use the .Add method since I'm doing this over and over again, and it was to my advantage to use it. Only when there is no action at all ($actions.Count -eq 0) do I use New-AzureRmAlertRuleEmail.

I assume if there isn’t at least one item in $actions then it’s safe to add the email.

$emailActions = $targetResourceId.properties.Actions.Count -1
 $emails = $actions[$emailActions].customemails

I use $addedEmail to tell my function whether or not I need to add the email. This is because the the function will run these steps in a ForEach loop.

Now that I have a means to get the alert email and update it, doing the converse is a matter of changing the .Add method to a .Remove method, and bingo, I have both an add and a delete. To see the entire script in action, see this Gist. P.S. I'm still working on the help and will update the Gist as it is updated.
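As a quick hedged sketch (using the same objects as above), the removal side looks like this:

# Remove an address from the last action's custom email list, if present.
$emails = $targetResourceId.properties.Actions[$emailActions].customemails
if ($emails -contains $email)
{
    $emails.Remove($email)
}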

I hope this helps someone out.

Until then keep scripting

thom

Deploying CRM with TFS 2015 tasks and PowerShell

Recently I was asked to put together some automation that deploys a CRM solution with Solution Packager. This blog post is about how I did that.

I started with the documentation on Solution Packager. What I found was that I could write a simple script that takes the source location of the solution and packages it into a zip. Here is what that run line looks like:

.\SolutionPackager.exe /action:Pack  `
/folder:'C:\CRM\CRM Solutions\default' /zipfile:c:\temp\myzip2.zip  `
/packagetype:unmanaged

Now to put a try/catch around it and do some other housekeeping. Since I'm going to call this from TFS, I need to make certain I support -Debug and the other common parameters, so I'll include [CmdletBinding()]. The full script is below:

[CmdletBinding()]
param
(
    [String]
    [Parameter(Mandatory)]
    $SourceFolder,
    [String]
    $zipfile = 'Crm'
)
$ErrorActionPreference = 'Stop'
$SourceFolder = (Get-Item $SourceFolder).FullName
"ZipFileName: $SourceFolder\$zipfile.zip"
Try
{
    & .\SolutionPackager.exe /action:Pack /folder:$SourceFolder `
        "/zipfile:$SourceFolder\$zipfile.zip" /packagetype:unmanaged
}
catch
{
    Write-Error $_.Exception.Message
    exit 1
}
Get-ChildItem -Path "$SourceFolder\$zipfile.zip" -Verbose
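Saved as a script (the file name here is hypothetical), the invocation from a build step looks like:

# Pack the checked-in solution folder into Crm.zip next to the sources.
.\Pack-CrmSolution.ps1 -SourceFolder 'C:\CRM\CRM Solutions\default' -zipfile 'Crm' -Verbose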

Now that I have the solution zipped up from what the developer checks into source control, I need a means to deploy it. The PowerShell developer in me wanted to write my own script, but I found that someone had already written this capability, and all I needed to do was add it to TFS. Here is what I found:

A developer by the name of Chaminda Chandrasekara created a TFS plugin (task) that does a solution import and activation of workflows.

With that in mind, I added Chaminda's code to my release process in TFS, and added the script I created above to my build process for the full solution. I did need to create a task for my script, which was done by following the steps found here.

[Screenshot: Microsoft Team Foundation Server - custom task]

My build process consists of two steps.

Step 1 is to create the package.

[Screenshot: Microsoft Team Foundation Server - build step]

Step 2 is to copy the artifacts to a Staging directory

[Screenshot: Microsoft Team Foundation Server - copy artifacts step]

Now onto the release process which also consists of two steps.

Step 1 is to do the solution import:

[Screenshot: release - solution import step]

In this setup I specify the name of the zip file from the earlier build. I have TFS variables defined in my release steps, identified by:

$(CRMUrl), $(CRMOrg), $(CRMSolution), etc.

To see how these are implemented, this site has a good write-up.

Step 2, the last step, is to publish the workflow:

[Screenshot: crmservices release - Visual Studio Team Services]

That does it, except for all the rest of the setup work you must do to push it through all your environments.

I hope this helps someone

Until then keep scripting

thom

Quick hit – Server and Site IPs

Recently I needed to quickly get server IPs and website addresses, so I put together a couple-line script to do it. This post is about how that works.

Since I need to get these remotely, I'm going to use a PowerShell session. First I'll create a variable to hold the servers I need:

$servers = 'server1','server2','server3','server4'

Now that I have a variable with servers in it, I can pass it to New-PSSession:

$session = new-pssession -computername $servers

My $session variable contains a session to each server I want to Invoke-Command on. Now I need a way to get the IPs for every site set up in IIS, plus all the IPs for each server. Run locally, the IIS part looks like this:

Import-Module WebAdministration
(Get-Website) | Select-Object name,
    @{Name='bindings'; Expression={ ($_.bindings.Collection.bindingInformation -replace ':\d+','').Trim(':') }}

I’m taking the values from Get-website  and selecting the name bindings and alson the bindings.collection binding information and removing the Port from the binding number. Return results look like this:

name : MySite
bindings : 10.10.10.39

Now all I need is a way to get the server's own IP addresses. I can do this with Get-NetIPAddress; wrapping the command in parentheses lets me get at the IPAddress property directly:

PS PS:\iis> (get-netipaddress).IPAddress
::1
10.10.10.1
127.0.0.1

Now to put it together in a single script:

$servers = 'server1','server2','server3','server4'
$sess = New-PSSession -ComputerName $servers -Credential $admCredentials   # $admCredentials holds admin credentials gathered beforehand
Invoke-Command -Session $sess {
    Import-Module WebAdministration
    (Get-Website) | Select-Object name,
        @{Name='bindings'; Expression={ ($_.bindings.Collection.bindingInformation -replace ':\d+','').Trim(':') }}
}
Invoke-Command -Session $sess { $env:COMPUTERNAME; (Get-NetIPAddress).IPAddress }

name : website1
bindings : 10.10.10.48
PSComputerName : server1
RunspaceId : 91f9523f-58df-49d5-a0b1-064101822aae

name : websiteapi
bindings : 10.10.10.49
PSComputerName : server1
RunspaceId : 91f9523f-58df-49d5-a0b1-064101822aae

name : website1
bindings : 10.10.10.50
PSComputerName : server2
RunspaceId : 3cd44533-d827-4a60-bdcc-6c91e77d96b9

name : websiteapi
bindings : 10.10.10.51
PSComputerName : server2
RunspaceId : a90c36ca-630e-42a8-abcd-069e8cec5360

server1
server2
::1
::1
::1
10.10.10.48
10.10.10.49
10.10.10.40
10.10.10.51

I hope this helps someone out.

Until then keep Scripting

thom

Create SSRS Data Source

Continuing on from my earlier discussion of SSRS and testing data sources, I've come up with a means to create a data source. This article demonstrates how I did that.

In testing the data source, I had to create a proxy to the SSRS server; we'll need to do the same thing here so we can get to the CreateDataSource method.

$reportWebService = 'http://yourReportServer/ReportServer/ReportService2010.asmx'
$credentials = Get-Credential
$reportproxy = New-WebServiceProxy -uri $reportWebService -Credential $credentials

The $reportWebService is a link to the web service on my SSRS instance, which, when proxied, gives me all the methods and properties of the ReportService2010.asmx class.

The method we’ll be using for this discussion is ReportingService2010.CreateDataSource.

This method takes five parameters:

[string] DataSource
[string] Parent
[boolean] Overwrite
[DataSourceDefinition] Definition
[Property[]] Properties

DataSource is a string: the name for the data source, including the file name and, in SharePoint mode, the extension (.rsds).

Parent is the fully qualified URL for the parent folder that will contain the data source. In my case I'm going to use /ThomTest, where the folder named ThomTest sits at the root of the SSRS server.

Overwrite tells the method to overwrite an existing data source if it finds one.

Definition is a DataSourceDefinition class instance that contains the values for the data source. This includes things like:

ConnectString, CredentialRetrieval, Enabled, EnabledSpecified, ImpersonateUser, ImpersonateUserSpecified, Password, Prompt, UserName, WindowsCredentials

For each of the above properties, here is what I've been able to discover so far about where they are used:

[Screenshot: DataSourceDefinition properties and where they appear in the data source settings]

Properties is a ReportService2010.Property[] array: an array of properties that are nearly the same thing as the data source definition, so some of the same data collected for the definition is reused in this property array.

The tough part of creating the data source was getting the values passed into the PowerShell function to be accepted by the proxied method. To do this I stumbled on a great article on StackOverflow, which allowed me to get at the classes from the proxied web service via calls similar to the one below:

$ssrsproxy = New-SSRSProxy -reportWebService $reportWebService -Credentials $credentials
$proxyNameSpace = $ssrsproxy.GetType().Namespace

So to get the DataSourceDefinition class I need, all I have to do is append the class name to the proxy namespace:

$proxyNameSpace = $ssrsproxy.gettype().Namespace 
$datasourceDef = New-Object("$proxyNameSpace.DataSourceDefinition")

Now my $datasourceDef is a DataSourceDefinition object containing the properties shown above. Since everything is now in an object, setting the items I need is just dot notation:

$datasourceDef = New-Object("$proxyNameSpace.DataSourceDefinition")
PS PS:\> $datasourceDef


Extension : 
ConnectString : 
UseOriginalConnectString : False
OriginalConnectStringExpressionBased : False
CredentialRetrieval : Prompt
WindowsCredentials : False
ImpersonateUser : False
ImpersonateUserSpecified : False
Prompt : 
UserName : 
Password : 
Enabled : False
EnabledSpecified : False

PS PS:\> $datasourcedef.Connectstring = 'MyConnectionSTring'

OK, now the last parameter is the tough one. This is where I had to get help from @Poshoholic on how to get a hashtable of values into the array of properties that the create method will accept.

Here is what the Hashtable looks like:

PS PS:\> $datasourceDefHash = @{
    'ConnectString' = $connectString; 'UserName' = $username;
    'Password' = $password; 'WindowsCredentials' = $windowsCredentials;
    'Enabled' = $enabled; 'Extension' = $Extension;
    'ImpersonateUser' = $ImpersonateUser; 'ImpersonateUserSpecified' = $true;
    'CredentialRetrieval' = $credentialRetrieval
}

My understanding is that what's needed is a property collection, so I named my variable accordingly:

$propertyCollection = $datasourceDefHash.Keys.foreach{
    @{ Name = $_; Value = $datasourceDefHash[$_] } -as "$proxyNameSpace.property"
}

The magic here is that we iterate through the keys and cast each name/value pair to $proxyNameSpace.property, which is the ReportService2010.Property type generated in the proxy's namespace. @Poshoholic informed me that because the name of the class is dynamic, we have to use the -as keyword to allow it to be 'cast' into the property type we need. Wow, I'm glad he helped me or I'd have been here a very long time.
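As a tiny illustration of that -as cast (assuming the proxy created above):

# Build one property the same way and confirm the generated type took.
$prop = @{ Name = 'Enabled'; Value = $true } -as "$proxyNameSpace.property"
$prop.GetType().FullName   # the Property type inside the proxy's generated namespace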

Now to put it all together. I originally wrote this function to allow for continuous deployment and creation of data sources. The only options I really wanted were storing the username and password securely on the report server, with the 'Use as Windows credentials when connecting to the data source' box checked, and the username and password supplied when calling the function.

So here is what my param block looks like:

param
(
    [Parameter(Mandatory = $false)]
    [string]$DataSourceName,
    [string]$path,
    [Parameter(Mandatory = $false)]
    [uri]$reportWebService,
    [string]$connectString,
    [string]$password,
    [string]$username,
    [ValidateSet('SQL','SQLAZURE','OLEDB','OLEDB-MD','ORACLE','ODBC','XML','SHAREPOINTLIST','SAPBW','ESSBASE','Report Server FileShare','NULL','WORDOPENXML','WORD','IMAGE','RPL','EXCELOPENXML','EXCEL','MHTML','HTML4.0','RGDI','PDF','ATOM','CSV','NULL','XML')]
    [string]$Extension = 'SQL',
    [boolean]$windowsCredentials = $false,
    [boolean]$enabled = $true,
    [boolean]$ImpersonateUser = $false,
    [ValidateSet('None', 'Prompt', 'Integrated', 'Store')]
    [string]$credentialRetrieval = 'Store',
    [System.Management.Automation.PSCredential]$credentials
)

Now that the user passes in their credentials and the items I need have default values, I can call the methods and classes described above:

#https://msdn.microsoft.com/en-us/library/reportservice2010.reportingservice2010.createdatasource.aspx
$ssrsproxy = New-SSRSProxy -reportWebService $reportWebService -Credentials $credentials
$proxyNameSpace = $ssrsproxy.GetType().Namespace
#https://msdn.microsoft.com/en-us/library/reportservice2010.datasourcedefinition.aspx
#A definition object is needed because CreateDataSource expects an object with some of the properties set.
$datasourceDef = New-Object("$proxyNameSpace.DataSourceDefinition")
#$dataSourceProps = New-Object ("$proxyNameSpace.property")
#$ssrsExtensions = ($ssrsproxy.ListExtensions('All')).name -join "','"  # for creating the set statement for extensions
#For some reason you have to set the extension and data source in the definition before attempting to create.
$datasourceDef.ConnectString = $connectString
$datasourceDef.Extension = $Extension
if ($credentialRetrieval -eq 'Store')
{
    $datasourceDef.WindowsCredentials = $windowsCredentials
    $datasourceDef.Password = $password
    $datasourceDef.CredentialRetrieval = $credentialRetrieval
    $datasourceDef.UserName = $username
}
$datasourceDefHash = @{
    'ConnectString' = $connectString; 'UserName' = $username; 'Password' = $password;
    'WindowsCredentials' = $windowsCredentials; 'Enabled' = $enabled; 'Extension' = $Extension;
    'ImpersonateUser' = $ImpersonateUser; 'ImpersonateUserSpecified' = $true;
    'CredentialRetrieval' = $credentialRetrieval
}
#Convert the hashtable to an array of proxy-namespace Property items. https://msdn.microsoft.com/en-us/library/reportservice2010.property.aspx
$propertyCollection = $datasourceDefHash.Keys.foreach{
    @{ Name = $_; Value = $datasourceDefHash[$_] } -as "$proxyNameSpace.property"
}
try
{
    $ssrsproxy.CreateDataSource($DataSourceName, $path, $true, $datasourceDef, $propertyCollection)
}
catch
{
    "Error was $_"
    $line = $_.InvocationInfo.ScriptLineNumber
    "Error was in Line $line"
}

The actual piece doing the creation of the data source is this line:

$ssrsproxy.CreateDataSource($DataSourceName, $path, $true, $datasourceDef, $propertyCollection)

The script in its full form is below:

<#
 .SYNOPSIS
 Creates an SSRS data source
 
 .DESCRIPTION
 This script creates a datasource from the PowerShell prompt.
 
 .PARAMETER DataSourceName
 The name of the data source to create.
 
 .PARAMETER path
 Path to where the data source will be created. This should be the root of where the source is created:
 for /report/report, the data source will be created at the second report value.
 
 .PARAMETER reportWebService
 URI to the location of the reportingService 2010 asmx page.
 
 .PARAMETER connectString
 This is the connection string that you use to connect to your database.
 
 .PARAMETER password
 Password to use if you are storing the credentials on the SQL server.
 
 .PARAMETER UserName
 Username to use for the connection if you are storing the credentials on the SQL Server.
 
 .PARAMETER Extension
 The Extension parameter is described as the Data Source Type in the new data source window in SSRS. Depending on your installation you may or may not have the items specified in the set statement for this function:
 'SQL' = SQL Server Connection
 'SQLAZURE' = SQL Azure Connection
 'OLEDB' = OLEDB connection 
 other possible connections include: 'OLEDB-MD','ORACLE','ODBC','XML','SHAREPOINTLIST','SAPBW','ESSBASE','Report Server FileShare','NULL','WORDOPENXML','WORD','IMAGE','RPL','EXCELOPENXML','EXCEL','MHTML','HTML4.0','RGDI','PDF','ATOM','CSV','NULL','XML'
 
 .PARAMETER windowsCredentials
 windowsCredentials = When using 'Store' with credential retrieval this sets the data source to 'Use as Windows credentials when connecting to the data source' 
 
 .PARAMETER enabled
 This Tells SSRS to enable the data source.
 
 .PARAMETER ImpersonateUser
 Set this to true if you want to use 'Impersonate the authenticated user after a connection has been made to the data source'.
 
 .PARAMETER credentialRetrieval
 CredentialRetrieval = one of four values:
 None = Credentials are not required
 Store = Credentials stored securely in the report server
 requires setting the username and password and optional params are impersonate and windowsCredentials
 Prompt = Credentials supplied by the user running the report
 Integrated = Windows integrated security
 
 .PARAMETER Credentials
 The credentials parameter is required to access the web service. They should be [System.Management.Automation.PSCredential] type
 
 
 .EXAMPLE
 PS C:\> $reportWebService = 'http://mySSRSServer//reportserver/reportservice2010.asmx'
 PS C:\> New-SSRSDataSource -DataSourceName 'ThomTest' -path '/ThomTest' -reportWebService $ReportWebService -connectString 'Data Source=servername;Initial Catalog=DB;Integrated Security=True' -username 'domain\user' -password 'password' -Extension SQL -enabled $true -windowsCredentials $true -credentialRetrieval Store -impersonateuser $true -credentials $credentials
 
 .NOTES
 Additional information about the function.
#>
function New-SSRSDataSource
{
 [CmdletBinding()]
 param
 (
 [Parameter(Mandatory = $false)]
 [string]$DataSourceName,
 [string]$path,
 [Parameter(Mandatory = $false)]
 [uri]$reportWebService,
 [string]$connectString,
 [string]$password,
 [string]$username,
 [ValidateSet('SQL','SQLAZURE','OLEDB','OLEDB-MD','ORACLE','ODBC','XML','SHAREPOINTLIST','SAPBW','ESSBASE','Report Server FileShare','NULL','WORDOPENXML','WORD','IMAGE','RPL','EXCELOPENXML','EXCEL','MHTML','HTML4.0','RGDI','PDF','ATOM','CSV','NULL','XML')]
 [string]$Extension = 'SQL',
 [boolean]$windowsCredentials = $false,
 [boolean]$enabled = $true,
 [boolean]$ImpersonateUser = $false ,
 [ValidateSet('None', 'Prompt', 'Integrated', 'Store')]
 [string]$credentialRetrieval = 'Store',
 [System.Management.Automation.PSCredential]$credentials
 )
 #https://msdn.microsoft.com/en-us/library/reportservice2010.reportingservice2010.createdatasource.aspx
 $ssrsproxy = New-SSRSProxy -reportWebService $reportWebService -Credentials $credentials
 $proxyNameSpace = $ssrsproxy.gettype().Namespace
 #https://msdn.microsoft.com/en-us/library/reportservice2010.datasourcedefinition.aspx
 $datasourceDef = New-Object("$proxyNameSpace.DataSourceDefinition") #definition is needed because the create expects an object with some of the properties set.
 #$dataSourceProps = New-Object ("$proxyNameSpace.property")
 #$ssrsExtensions = ($ssrsproxy.ListExtensions('All')).name #-join "','" for creating the set statement for extensions.
 #for some reason you have to set the extension and datasource in the definition before attempting to create. 
 $datasourceDef.connectstring = $connectString
 $datasourcedef.Extension = $Extension
 if ($credentialRetrieval -eq 'Store')
 {
 $datasourceDef.WindowsCredentials = $WindowsCredentials
 $datasourceDef.password = $password
 $datasourceDef.CredentialRetrieval = $credentialRetrieval
 $datasourceDef.username = $username
 }
 $datasourceDefHash = @{
 'ConnectString' = $connectString; 'UserName' = $username; 'Password' = $password; 'WindowsCredentials' = $windowsCredentials; 'Enabled' = $enabled; 'Extension' = $Extension; 'ImpersonateUser' = $ImpersonateUser; 'ImpersonateUserSpecified' = $true; 'CredentialRetrieval' = $credentialRetrieval
 }
 #convert the hashtable to an array of proxynamespace property items. https://msdn.microsoft.com/en-us/library/reportservice2010.property.aspx
 $propertyCollection = $datasourceDefHash.Keys.foreach{ @{ Name = $_; Value = $dataSourceDefHash[$_] } -as "${proxyNamespace}.property" }
 try
 {
 $ssrsproxy.CreateDataSource($DataSourceName, $path, $true, $datasourceDef, $propertyCollection)
 }
 catch
 {
 "Error was $_"
 $line = $_.InvocationInfo.ScriptLineNumber
 "Error was in Line $line"
 }

}

function New-SSRSProxy
{
    param
    (
        [string]$reportWebService,
        [Parameter(Mandatory = $true,
                   ValueFromPipeline = $true,
                   ValueFromPipelineByPropertyName = $true)]
        [System.Management.Automation.PSCredential]$Credentials
    )
    Begin
    {
        if ($reportWebService -notmatch 'asmx')
        {
            $reportWebService = "$reportWebService/ReportService2010.asmx?WSDL"
        }
    }
    Process
    {
        #Create the proxy
        Write-Verbose "Creating Proxy, connecting to : $reportWebService"
        $ssrsProxy = New-WebServiceProxy -Uri $reportWebService -UseDefaultCredential -ErrorAction 0
        #Test that we're connected; fall back to explicit credentials if not.
        $members = $ssrsProxy | Get-Member -ErrorAction 0
        if (!($members))
        {
            if (!$Credentials)
            {
                $Credentials = Get-Credential -Message 'Enter credentials for the SSRS web service'
            }
            $ssrsProxy = New-WebServiceProxy -Uri $reportWebService -Credential $Credentials
        }
        $ssrsProxy
    }
    End { }
}

I hope this helps someone

Until then keep scripting

Thom