This is a quick one; it's been forever since I've posted here. After moving back to Autotask, there's still a ton of things to automate. One of the things that was bugging me is that you can't have the client portal enabled by default. Well, here's a script you can run periodically to give all your users the simple version of the client portal.

There are a few places you might want to update. I filter on a specific company category, so where the $companies variable is set you may want to change or remove this part:

{"op":"eq","field":"companyCategoryID","value":"101"}
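For example, to target every active company instead of a single category (an alternative filter, not what the script below uses), that same spot would read:

{"op":"eq","field":"isActive","value":"true"}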

Here's the script. It currently only handles companies with 500 or fewer contacts; I'll update it with a loop to pull all the contacts as well (there's a rough sketch of that loop right after the script).

Function New-SecurePassword {
    # Pick 10 random characters (no repeats, courtesy of Get-Random -Count) and join them into a string
    $Password = "!?@#$%^&*0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ_abcdefghijklmnopqrstuvwxyz".ToCharArray()
    ($Password | Get-Random -Count 10) -Join ''
}

$at_uri = $($env:at_uri)
$at_integrationcode = $($env:at_integrationcode)
$at_username = $($env:at_username)
$at_secret = $($env:at_secret)

###AT HEADERS###
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("ApiIntegrationcode", "$at_integrationcode")
$headers.Add("Content-Type", 'application/json')
$headers.Add("UserName", "$at_username")
$headers.Add("Secret", "$at_secret")

$companies = $(Invoke-RestMethod -uri $($at_uri + '/v1.0/Companies/query?search={"IncludeFields": ["id", "companyName","companyNumber","isActive"],"filter":[{"op":"eq","field":"companyCategoryID","value":"101"}]}') -Headers $headers -Method Get).items

foreach ($company in $companies | Select-Object -skip 3) { 
    $contacts = $(Invoke-RestMethod -uri $($at_uri + '/v1.0/Contacts/query?search={"IncludeFields": ["id", "firstName","lastName","isActive","emailAddress"],"filter":[{"op":"and","items":[{"op":"eq","field":"companyID","value":"' + $($company.id) + '"},{"op":"eq","field":"isActive","value":"true"}]}]}') -Headers $headers -Method Get).items
    $query = $null;
    $x = 0; $y = 0;
    $clientportal = @();
    # Look up existing ClientPortalUsers in batches of up to 100 contactID conditions per query
    do {
        foreach ($contact in $contacts) {
            if ($query) { 
                $query += ',{"op":"eq","field":"contactID","value":"' + $($contact.id) + '"}'
            }
            if (!$query) { 
                $query = '{"op":"eq","field":"contactID","value":"' + $($contact.id) + '"}'
            }
            $y++; $x++;
            if ($x -eq $contacts.Count) { $y = 100 } # force the last (partial) batch to run
            if ($y -eq 100) { 
                $postbody = '{"filter":[{"op":"or","items":[' + $query + ']}]}'
                $clientportal += $(Invoke-RestMethod -uri $($at_uri + '/v1.0/ClientPortalUsers/query') -Body $postbody -Headers $headers -Method Post).items
                $query = $null; $y = 0;
            }
        }
    }
    while ($x -lt $contacts.count)
    # Any contact without an existing portal user gets one created below
    if ($clientportal.count -ne $contacts.count) { 
        write-host "Contacts: $($contacts.count)"
        write-host "Enabled: $($clientportal.count)"
        $missing = $null;
        $missing = $contacts.id | Where-Object { $_ -notin $clientportal.contactId }
        foreach ($miss in $missing) { 
            $contact = $null;
            $contact = $contacts | Where-Object { $miss -eq $_.id }
            write-host "$($contact.emailAddress)"
            $json = [PSObject]@{
                contactID            = $($contact.id)
                userName             = "$($contact.emailAddress)"
                securityLevel        = 1
                password             = "$(new-securepassword)"
                numberFormat         = 22
                dateFormat           = 1
                timeFormat           = 1
                isClientPortalActive = $true
            }
            $json = $json | ConvertTo-Json
            Start-Sleep -Milliseconds 10
            Invoke-RestMethod -uri $($at_uri + '/v1.0/ClientPortalUsers') -Method POST -Body $json -Headers $headers
        }
    }
}
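Here's roughly what that pagination loop will look like when I get around to adding it. Treat it as a sketch only: it assumes the pageDetails.nextPageUrl property the Autotask REST API returns on query responses, and $search stands in for the same JSON search string the script builds for the contacts query above.

# Sketch: page through all contacts instead of relying on a single response
$contacts = @()
$response = Invoke-RestMethod -uri $($at_uri + '/v1.0/Contacts/query?search=' + $search) -Headers $headers -Method Get
$contacts += $response.items
while ($response.pageDetails.nextPageUrl) {
    $response = Invoke-RestMethod -uri $response.pageDetails.nextPageUrl -Headers $headers -Method Get
    $contacts += $response.items
}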

A simple command turned crazy. I ended up putting this together because we have duplicate display names in this Exchange Online tenant and I needed a reliable way to get mailbox sizes.

Get-Mailbox -ResultSize Unlimited | select @{ Name = 'Identity';  Expression = {$_.primarysmtpaddress}} | Get-MailboxStatistics | Select DisplayName, @{name="TotalItemSize";expression={[math]::Round($($_.totalitemsize.Value.ToString().replace(",","").split("(")[1].split(" bytes")[0])/1GB,2)}} | Where {$_.TotalItemSize -gt 45} | ft -auto

Breakdown of the code

Get-Mailbox gets all the mailboxes available; there's no search filter on this one.

Get-Mailbox -ResultSize Unlimited

The select statement builds a calculated property so we can transform the Identity value that gets used as pipeline input for Get-MailboxStatistics. I did this mainly because we have duplicate display names but differing email addresses in this specific tenant.

| select @{ Name = 'Identity';  Expression = {$_.primarysmtpaddress}}

Get-MailboxStatistics accepts pipeline input for its Identity parameter. With the select statement above, -Identity is now bound from the pipeline using the primary SMTP address. This command outputs the statistics for each mailbox.

| Get-MailboxStatistics

This next select statement gives us the TotalItemSize formatted for use with comparison.

| Select DisplayName, @{name="TotalItemSize";expression={[math]::Round($($_.totalitemsize.Value.ToString().replace(",","").split("(")[1].split(" bytes")[0])/1GB,2)}}

This part of the select statement breaks down as follows:
The @{name=...;expression={}} block is a calculated property; the { after expression= opens the script block whose result becomes TotalItemSize.

@{name="TotalItemSize";expression={}}

Next up is the [math]::Round() static method. We use it because the raw division would leave a large number of decimal places, so we round the result.

[math]::Round()

Inside the parentheses of Round() we have this expression:

$($_.totalitemsize.Value.ToString().replace(",","").split("(")[1].split(" bytes")[0])/1GB

The TotalItemSize is modified to become a string

$_.totalitemsize.Value.ToString()

then we replace the commas in the string

.replace(",","")

split it at the first ( and grab the second part of the array

.split("(")[1]

then split again at bytes and grab the first part of that array

.split(" bytes")[0]

Finally we finish the Round() call with ,2, which rounds the result to 2 decimal places.
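As a quick worked example with a made-up byte count of 48,841,523,200 (not from a real mailbox):

[math]::Round(48841523200/1GB,2)
# returns 45.49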

The next pipeline stage is the Where-Object filter; in this example we only care about mailboxes over 45 GB. You can change the threshold to whatever you'd like to filter on.

| Where {$_.TotalItemSize -gt 45}
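For instance, if you'd rather catch mailboxes over 10 GB and save the results instead of printing a table, you could swap the tail of the pipeline for something like this (the CSV path is just an example):

| Where {$_.TotalItemSize -gt 10} | Sort-Object TotalItemSize -Descending | Export-Csv -Path .\LargeMailboxes.csv -NoTypeInformation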

Phew, this one took a minute to figure out. ConnectWise has a form-based documents API (technically not really an API, but it's the way you get a document into a CW ticket). The first hurdle is the really amazing documentation that CW provides around the documents API.

Second is getting PowerShell to handle the file encoding correctly, build a multipart form data payload, and actually send the right thing. Ultimately there were some good lessons here, mainly around constructing a proper content type of "multipart/form-data". I'm writing this in the hope that those of you out there facing a similar challenge getting documents uploaded into CW via PowerShell aren't faced with the same two-day fight I just had.

Encoding Issues

The encoding issues were mainly around reading a file into PowerShell and then sending it off with Invoke-RestMethod. Typically you'd work in UTF-8, and while that's fine inside PowerShell, decoding the file bytes as UTF-8 and sending that via Invoke-RestMethod breaks things: none of the characters come out right, and the data stream arriving at your destination is garbled.

Left – Proper Data | Right – Garbled Data

I happened to stumble upon this article (and by stumble, I mean I'd been searching the Google masters for quite a while trying to understand why the encoding wasn't working correctly): https://social.technet.microsoft.com/Forums/en-US/26f6a32e-e0e0-48f8-b777-06c331883555/invokewebrequest-encoding?forum=winserverpowershell

which nicely pointed me here:
https://windowsserver.uservoice.com/forums/301869-powershell/suggestions/13685217-invoke-restmethod-and-invoke-webrequest-encoding-b

Taking from this, I modified the following from:

$fileEnc = [System.Text.Encoding]::GetEncoding('UTF-8').GetString($fileBytes);

to the ISO 8859-1 code page, 28591. Converting the line to:

$fileEnc = [System.Text.Encoding]::GetEncoding(28591).GetString($fileBytes);
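A quick illustration of why code page 28591 works here: every one of the 256 byte values maps to exactly one character and back, while UTF-8 replaces invalid byte sequences, so binary data doesn't survive the round trip. The sample bytes below are made up purely for the demo:

$bytes = [byte[]](0x89, 0x50, 0x4E, 0x47, 0xFF, 0xD8)  # made-up binary header bytes
$utf8  = [System.Text.Encoding]::UTF8.GetString($bytes)
$iso   = [System.Text.Encoding]::GetEncoding(28591).GetString($bytes)
[System.Text.Encoding]::UTF8.GetBytes($utf8).Length              # not 6 - invalid bytes became U+FFFD
[System.Text.Encoding]::GetEncoding(28591).GetBytes($iso).Length # 6 - every byte maps back 1:1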

The rest of the time was learning to deal with boundaries in a multipart/form-data payload. Essentially finding this article:
https://gist.github.com/weipah/19bfdb14aab253e3f109

This taught me a bit about the boundary lines that need to be set and, more importantly, where "`r`n" has to appear; you'll see it handled the same way as in that gist, via the $LF variable in my script below.

Enjoy, here’s the full code layout:

###INITIALIZATIONS###
$global:CWcompany    = "xxxcompanyname"
$global:CWprivate    = "xxxprivatekey"
$global:CWpublic     = "xxxpublickey"
$global:CWserver     = "https://na.myconnectwise.net/v4_6_release/apis/3.0/system/documents"
##don't use the api- url here for the server##
###CW AUTH STRING###
[string]$Authstring  = $CWcompany + '+' + $CWpublic + ':' + $CWprivate
$encodedAuth         = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(($Authstring)));

###CW HEADERS###
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Authorization", "Basic $encodedAuth")

$FilePath = 'c:\users\Tom\Desktop\image001.jpg'
$fileBytes = [System.IO.File]::ReadAllBytes($FilePath);
$fileEnc = [System.Text.Encoding]::GetEncoding(28591).GetString($fileBytes);
$boundary = [System.Guid]::NewGuid().ToString(); 
$LF = "`r`n";

$bodyLines = ( 
    "--$boundary",
    "Content-Disposition: form-data; name=`"recordType`"$LF",
    "Ticket",
    "--$boundary",
    "Content-Disposition: form-data; name=`"recordId`"$LF",
    "6956",
    "--$boundary",
    "Content-Disposition: form-data; name=`"Title`"$LF",
    "testingFINAL",
    "--$boundary",
    "Content-Disposition: form-data; name=`"file`"; filename=`"image001.jpg`"",
    "Content-Type: application/octet-stream$LF",
    $fileEnc,
    "--$boundary--$LF" 
) -join $LF

Invoke-RestMethod -Uri $CWserver -Method Post -ContentType "multipart/form-data; boundary=`"$boundary`"" -Body $bodyLines -Headers $headers

I've found myself at a new job, recreating many of the processes I spent the last few years putting together, tweaking, and modifying, this time building up a new managed services provider with an exciting new company. One of those challenges led me to email parsing and ConnectWise (CW). Previously I had the opportunity to use Autotask and email2ticket, but with the API changes CW made a few years ago, email2ticket is no longer supported for CW. With my newfound love of Azure Functions I looked at existing mail-parsing tools (https://mailparser.io/, https://www.thinkautomation.com/), which have amazing feature sets, but they didn't do quite what I was looking for or replace the functionality I once had with email2ticket.

That leads me to this PowerShell series on how to use the CW REST API, and the things that took some understanding and digging to figure out even for simple queries. The CW developer portal has some great resources, and I stumbled upon the forum posts that ultimately made it possible to build a wildcard query in PowerShell to identify whether a contact exists in CW.

First things first, you have to authenticate to the CW REST API. This requires you to generate a CW API Access Account and you need access to the Admin Setup tables to do so.

CW Members Tab – API Members

Once you have an integration set up, you can proceed with creating a PowerShell script to handle the automation. I build mostly in Azure Functions, so there will be some pieces in here that relate to that; I'll break down each section (and eventually move some of these pieces into linked articles).

Authentication

Authentication to the CW REST API is fairly simple. It requires your public and private keys and your company identifier.

First set your variables for your credentials.

$global:CWcompany    = "company"
$global:CWprivate    = "privatekey"
$global:CWpublic     = "publickey"
$global:CWserver     = "https://api-na.myconnectwise.net"

Second, configure the authentication string and set up the standard headers for your GET request.

[string]$Accept      = "application/vnd.connectwise.com+json; version=3.0"  # header value only; the Accept header name is added below
[string]$ContentType = 'application/json'
[string]$Authstring  = $CWcompany + '+' + $CWpublic + ':' + $CWprivate
$encodedAuth         = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(($Authstring)));

$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Authorization", "Basic $encodedAuth")
$headers.Add("Content-Type", 'application/json')
$headers.Add("Accept", $Accept)

Then there's the URI, which consists of a few parts: your base URL, the target for your request, and your query parameters.

Query String – these are the conditions you pass in to do the lookup. In this case I'm looking up a contact by email address, which requires CW's childconditions parameter. communicationItems/value is matched against the email address, and communicationItems/communicationType is pinned to "Email" so only email entries are searched, which speeds up the query. The "%" is the wildcard for the "like" operator. I spent some time trying to figure out what would work best; contains and in both came back as invalid syntax, so the like operator was the eventual conclusion.

[string]$query       = '?childconditions=communicationItems/value like "%' + $email + '%" AND communicationItems/communicationType="Email"'

Putting it all together, you have a target of /company/contacts plus the query string. $email is the full or partial email address you're searching for. The benefit of a partial address, such as @xyz.corp, is that it yields every contact for that domain, and you can then use some logic to determine which company they belong to (more to come on that subject; there's a quick grouping example after the query below).

[string]$TargetUri   = '/company/contacts'
[string]$query       = '?childconditions=communicationItems/value like "%' + $email + '%" AND communicationItems/communicationType="Email"'
[string]$BaseUri     = "$CWserver" + "/v4_6_release/apis/3.0" + $TargetUri + $query

Finally, send the Invoke-RestMethod command to get the results. This returns a JSON table that Invoke-RestMethod converts to a PS Object.

$JSONResponse = Invoke-RestMethod -URI $BaseURI -Headers $Headers -ContentType $ContentType -Method Get
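To follow up on the domain-lookup idea from earlier: once $JSONResponse comes back from a partial-address search, a quick Group-Object over the company names shows which company the matching contacts belong to. This is just a sketch of that logic, not part of the query itself:

$JSONResponse | Group-Object { $_.company.name } | Sort-Object Count -Descending | Select-Object Count, Name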

Here’s the full code that I’m using to query for a specific contact’s email address.

# GET method: each querystring parameter is its own variable
if ($req_query_email) 
{
    $email = $req_query_email
}

###INITIALIZATIONS###
$global:CWcompany    = "company"
$global:CWprivate    = "privatekey"
$global:CWpublic     = "publickey"
$global:CWserver     = "https://api-na.myconnectwise.net"

###CW AUTH STRING###
[string]$Accept      = "application/vnd.connectwise.com+json; version=3.0"
[string]$Authstring  = $CWcompany + '+' + $CWpublic + ':' + $CWprivate
[string]$ContentType = 'application/json'
$encodedAuth         = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(($Authstring)));

###CW HEADERS###
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Authorization", "Basic $encodedAuth")
$headers.Add("Content-Type", 'application/json')
$headers.Add("Accept", $Accept)

###CW QUERY###
[string]$TargetUri   = '/company/contacts'
[string]$query       = '?childconditions=communicationItems/value like "%' + $email + '%" AND communicationItems/communicationType="Email"'
[string]$BaseUri     = "$CWserver" + "/v4_6_release/apis/3.0" + $TargetUri + $query

###GET RESPONSE###
$JSONResponse = Invoke-RestMethod -URI $BaseURI -Headers $Headers -ContentType $ContentType -Method Get

###PARSE CONTACT INFO TO USABLE SHORT TABLE###
$contactInfo = @()
foreach($contact in $JSONResponse){
    $email = $null; $emails = $null
    $obj = New-Object PSObject
    $obj | Add-Member -MemberType NoteProperty -Name "id" -Value $contact.id
    $obj | Add-Member -MemberType NoteProperty -Name "firstName" -Value $contact.firstName
    $obj | Add-Member -MemberType NoteProperty -Name "lastName" -Value $contact.lastName
    foreach ($commtype in $contact.communicationItems) {
        $email = $($commtype | Where-Object {$_.communicationType -eq "Email"}).value
        if ($email.length -gt 2) {$emails += $email + ";"}
    }
    $obj | Add-Member -MemberType NoteProperty -Name "emails" -Value $emails
    $obj | Add-Member -MemberType NoteProperty -Name "company" -Value $contact.company.name
    $obj | Add-Member -MemberType NoteProperty -Name "companyid" -Value $contact.company.id
    $obj | Add-Member -MemberType NoteProperty -Name "companyidentifier" -Value $contact.company.identifier
    $contactInfo += $obj
}

If($contactInfo)
{
    Out-File -Encoding Ascii -FilePath $res -inputObject $($contactInfo | ConvertTo-Json)
}

Else
{
    Return $False
}

Wow, it's been a while since I've done a real post on this site. I've got many interesting things to discuss, and it has been a great last few years. More to come, but for now I've just moved hosting providers and onto a VPS.

It's been a while since I've posted. Way too long. I've had this script for quite a while and wanted to share it with the world. LogicMonitor is releasing a new REST API which requires a session-based login. This script helps you obtain that session and download the audit log for the last hour. You'll have to modify your timezone settings (in the AddHours lines, currently set for EST).

Here’s the script:

$user = "username"
$pass= "P@ssw0rd"

#get epoch time for current and x hours before
$date1 = Get-Date -Date "01/01/1970"
#start time: one hour ago, shifted from EST (UTC-5) to UTC
$date2 = (Get-Date).AddHours(4)
$epochStart= (New-TimeSpan -Start $date1 -End $date2).TotalSeconds
#end time: now, shifted from EST (UTC-5) to UTC
$date2 = (Get-Date).AddHours(5)
$epochEnd= (New-TimeSpan -Start $date1 -End $date2).TotalSeconds
#round the time to not have decimals
$epochStart= [math]::Round($epochStart)
$epochEnd= [math]::Round($epochEnd)

$filter = "_all~update" #check LM documentation on filters
$fields = "username,happenedOnLocal,description"
#build uri for access logs
$uri = "https://{account}.logicmonitor.com/santaba/rest/setting/accesslogs?sort=-happenedOn&filter=$filter,happenedOn>:$epochStart&fields=$fields"
#build base64Auth for the header
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$pass)))
#get the events
$events = Invoke-RestMethod -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -Uri $uri
$events #display events that were gathered
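If you're on PowerShell 5 or later (.NET 4.6+), a timezone-independent way to build the same one-hour window, shown here only as a sketch, is to compute the epoch values in UTC so the AddHours offsets never need adjusting:

$epochEnd   = [DateTimeOffset]::UtcNow.ToUnixTimeSeconds()
$epochStart = $epochEnd - 3600   # one hour back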

When using Autotask's API, you're required to look up various picklist values that get used when you update entities in your web requests. This is a PowerShell way to pull those picklist values. The first part of the script resolves your AT zone URL, and the second part gets the entity field data; in this case I was looking for "Ticket" related fields.

Edit: Updated to give a more friendly output

# Username and password for Autotask
$ATurl = "https://webservices1.autotask.net/atservices/1.5/atws.wsdl"
$ATusername = "{AT Username}"
$ATpassword = ConvertTo-SecureString "{AT Password}" -AsPlainText -Force
$ATcredentials = New-Object System.Management.Automation.PSCredential($ATusername,$ATpassword)

$atws = New-WebServiceProxy -URI $ATurl -Credential $ATcredentials
$zoneInfo = $atws.getZoneInfo($ATusername)
$ATurl = $zoneInfo.URL.Replace(".asmx",".wsdl")
$atws = New-WebServiceProxy -URI $ATurl -Credential $ATcredentials
 
$entity= $atws.getFieldInfo("Ticket")
 
foreach ($picklist in $entity) {
    $picklist | select Name,Label,Description | ft
    foreach ($values in $picklist.PicklistValues) { $values | select Label,Value,IsActive }
}

$output= $atws.getThresholdAndUsageInfo()
$output.EntityReturnInfoResults.message
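If you only need one field's picklist, you can filter the same $entity result. For example, to show just the Status values (Status is used here as an example field name; swap in whichever field you're after):

$statusField = $entity | Where-Object { $_.Name -eq 'Status' }
$statusField.PicklistValues | Select-Object Label, Value, IsActive | Format-Table -AutoSize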

This is a simple script to gather volume information including dedupe schedule and autogrow settings. I’m going to combine this with my snapshot script in the future to make a recommended dedupe schedule based on the average snapshot times.

Found a solution on the NetApp communities for deleting the Informational events that plague OnCommand Core.

https://communities.netapp.com/message/94591#94591


Problem:

Gathering snapshot statistics is a tedious task when you're looking at autosupports and CLI output. I needed to gather the oldest snapshot, the average number of snapshots per day, the total snapshot count, and other details.

Solution:

A PowerShell script using the Data ONTAP PowerShell library. Read more for the script.
