Create multiple hybrid migration batches with PowerShell

In Exchange Online migration projects you sometimes need many small migration batches instead of a few big ones. In practice this means you end up with many CSV files which you want to use for bulk creation of migration batches.

You can either create them in the GUI (which is not much fun when you have more than 10 CSV files) or use this small PowerShell command. Replace the placeholder values with your own:

Get-ChildItem *.csv | ForEach-Object { New-MigrationBatch -Name $_.BaseName -TargetDeliveryDomain "TENANTNAME.mail.onmicrosoft.com" -AutoStart -AllowUnknownColumnsInCsv $true -NotificationEmails "YOUR.EMAIL@DOMAIN.COM" -CSVData ([System.IO.File]::ReadAllBytes($_.FullName)) -BadItemLimit 99999 -LargeItemLimit 99999 -AllowIncrementalSyncs $true -SourceEndpoint "NAME OF YOUR HYBRID ENDPOINT" }


This command searches for all CSV files in the current folder and creates a migration batch for each CSV file with the following attributes:

  • The name of the batch will be the file name of the CSV file without the file extension
  • The batch will start automatically but has to be completed manually
  • The notification emails will be sent to the address you provide here
  • The CSV files may contain any columns, but the column “EmailAddress” must be present (see the minimal sample sketched after this list)
  • The batches will perform incremental syncs
  • The BadItemLimit and LargeItemLimit are set very high to make sure mailboxes are not skipped because of items that cannot be migrated
  • The hybrid endpoint of your organization will be used for the move
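
For illustration, this is a minimal sketch of how such a CSV file could look and be created from PowerShell (the file name and addresses are just examples):

#Hypothetical example: a batch file containing only the mandatory EmailAddress column
@"
EmailAddress
user1@contoso.com
user2@contoso.com
"@ | Set-Content -Path .\Batch-Sales.csv -Encoding UTF8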

Of course you may adjust this command to your needs, e.g. enable automatic completion or set a lower BadItemLimit.
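
If you keep manual completion, you can later complete all batches that have finished their initial sync in one go. A rough sketch (adjust the status filter to your needs):

#Complete every batch that has reached the "Synced" state
Get-MigrationBatch | Where-Object { $_.Status -eq "Synced" } | ForEach-Object { Complete-MigrationBatch -Identity $_.Identity -Confirm:$false }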

In my case there was a limit of 100 migration batches. The Exchange Online Service Description doesn’t mention this limit, but be aware that it may hit you as well.
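
To keep an eye on how many batches already exist in the tenant, a quick check is:

(Get-MigrationBatch).Count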

Happy migrating 🙂

Disable OWA attachment download

Some companies’ security policies require that no company data is saved on “non-company” devices. A first step towards that goal is to disable attachment downloads in OWA. To do this, you can simply clear the checkbox in ECP:

[Screenshot: OWA file access settings in ECP]

You can also create a new OWA policy and specify the following:

[Screenshot: OWA file access settings in a new OWA mailbox policy]

If you are more of a PowerShell person, this one-liner disables attachment download on all existing OWA mailbox policies:

Get-OwaMailboxPolicy | Set-OwaMailboxPolicy -DirectFileAccessOnPublicComputersEnabled $false -DirectFileAccessOnPrivateComputersEnabled $false
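
If you went for a new, dedicated policy instead (as in the screenshot above), the same settings can be applied and assigned via PowerShell as well. A sketch, where the policy name and mailbox used here are placeholder examples:

#Create a dedicated OWA mailbox policy and disable direct file access
New-OwaMailboxPolicy -Name "NoAttachmentDownload"
Set-OwaMailboxPolicy -Identity "NoAttachmentDownload" -DirectFileAccessOnPublicComputersEnabled $false -DirectFileAccessOnPrivateComputersEnabled $false

#Assign the policy to a single mailbox
Set-CASMailbox -Identity "jdoe@contoso.com" -OwaMailboxPolicy "NoAttachmentDownload"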

The result is that attachments can no longer be downloaded:

[Screenshot: OWA file access – attachment download no longer possible]

The cool thing is that viewing attachments in Office Online is still possible.
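
To verify both behaviors per policy (direct file access and Office Online viewing), a quick check could look like this:

Get-OwaMailboxPolicy | Format-Table Name, DirectFileAccessOnPublicComputersEnabled, DirectFileAccessOnPrivateComputersEnabled, WacViewingOnPublicComputersEnabled, WacViewingOnPrivateComputersEnabled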

There are more features that can be disabled to gain more security; these will be discussed in separate articles.

Disconnect inactive RDP sessions

Many admins know the problem: there is no dedicated terminal server for administration, so only 2 RDP sessions per server are possible. And when connecting to a server there are either no free sessions or many disconnected (not “logged-off”) sessions which are consuming resources and slowing down the server.

If you only want to list the disconnected sessions, you can run just this:

#Filter the lines where the 4th token from the end is 'Disc' – the STATE column of disconnected sessions (the token position depends on the date/time format of the query user output)
query user | select -Skip 1 | ? {($_ -split "\s+")[-4] -eq 'Disc'}


To disconnect all these sessions at once, you can execute this command inside an elevated PowerShell window:

#For each disconnected session, take the session ID (5th token from the end) and log it off
query user | select -Skip 1 | ? {($_ -split "\s+")[-4] -eq 'Disc'} | % {logoff ($_ -split "\s+")[-5] /v}


All disconnected RDP sessions are then forcibly logged off and their resources are freed.
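
The same approach also works against a remote server without opening an RDP session to it first. A sketch, assuming the server name SRV01 and the same column layout of the query user output:

#Log off all disconnected sessions on a remote server
query user /server:SRV01 | select -Skip 1 | ? {($_ -split "\s+")[-4] -eq 'Disc'} | % {logoff ($_ -split "\s+")[-5] /server:SRV01 /v}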


ADFS Proxy – An error occurred when attempting to establish a trust relationship with the federation service

This is a really weird and annoying error which can drive you crazy. But let’s start from the beginning. So what do we have?

  • An Office 365 tenant
  • An ADFS server in the internal network
  • An ADFS Proxy (a WAP) in the perimeter network
  • A wildcard certificate which was issued by a public CA

So, up to now nothing special. The ADFS server was configured without problems and is up & running. The firewall between the ADFS server and the ADFS proxy was opened on port 443 so that the two can communicate with each other. So I started the configuration of the WAP server, entered all the necessary data, and then this error appeared:

[Screenshot: WAP configuration error in the GUI]

Trying the configuration with PowerShell didn’t work better:

[Screenshot: WAP configuration error in PowerShell]
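
For reference, the PowerShell attempt looked roughly like this (a sketch; the service name and thumbprint are placeholders):

#Establish the trust between the WAP and the internal ADFS farm
$cred = Get-Credential #an account that is local administrator on the ADFS server
Install-WebApplicationProxy -FederationServiceName "sts.contoso.com" -CertificateThumbprint "THUMBPRINT OF THE WILDCARD CERTIFICATE" -FederationServiceTrustCredential $cred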

So the first place to look was the event logs of the machines. What did I see?

  • On the ADFS proxy: No entries – neither in the Application nor in the ADFS event log. Yeah. *Happiness*
  • On the ADFS server: Event ID 364 with unhelpful descriptions like this:
    • Encountered error during federation passive request. […] Contact your administrator for details – and a long stack trace

When opening one of the ADFS endpoints from the ADFS proxy, the following error is raised:

[Screenshot: TLS/certificate error in Internet Explorer when opening the ADFS endpoint]

It seems that the certificate is not presented correctly from ADFS to the WAP, and the error message in Internet Explorer is useless.
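
One way to see which certificate the ADFS server actually presents is to open a TLS connection from the WAP yourself. A small sketch, assuming the (hypothetical) federation service name sts.contoso.com:

#Connect from the WAP to the ADFS server and print the subject of the presented certificate
$Callback = [System.Net.Security.RemoteCertificateValidationCallback]{ $true } #accept any certificate, we only want to look at it
$Tcp = New-Object System.Net.Sockets.TcpClient("sts.contoso.com", 443)
$Ssl = New-Object System.Net.Security.SslStream($Tcp.GetStream(), $false, $Callback)
$Ssl.AuthenticateAsClient("sts.contoso.com")
$Ssl.RemoteCertificate.Subject
$Ssl.Dispose()
$Tcp.Close()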

So... what to do next? Start to google and analyze the traffic using Fiddler. To shorten the story a little bit: in the Fiddler logs I could see that there is a problem with the certificate, but this may also be related to Fiddler’s SSL decryption feature.

Googling around this error brings up a ton of tips and tricks about what the cause could be, for example:

  • Certificate error: https://support.microsoft.com/en-us/help/3044974
  • About the Cipher Suites:
    • https://support.microsoft.com/en-us/help/3194197/considerations-for-disabling-and-replacing-tls-1.0-in-adfs
    • http://s4b-usergroup.com/office365-blog/adfs-3-0-tls-error/
    • really cool explanation: https://blogs.technet.microsoft.com/keithab/2015/06/22/error-while-configuring-wapthe-underlying-connection-was-closedpart-2/

But none of this worked.


So after 4 days of troubleshooting, re-installing and investigating, I decided to start from scratch and check each and every point again. And at this point the firewall guys told me: “Yes, port 443 is open. Yes, we have content inspection running.”

And here we go: disabling content inspection solved the problem, and now ADFS and the WAP can communicate with each other. The moral of this story: double-check with the firewall guys and the network security appliances. This can save you many days of troubleshooting 🙂

Using PowerShell to connect to Lotus Notes COM object

Sometimes you have to query and/or change certain data in a Domino Directory. In general there are 2 options to connect to Lotus Notes:

  1. via LDAP
  2. via the Notes COM Object

Querying via LDAP is a standardized and easy-to-understand way. But it has some disadvantages:

  • you cannot perform write operations
  • multi-value fields are hard to identify

So the alternative is using the COM object, where the above-mentioned disadvantages do not apply. The question now is: how do you do this in PowerShell?

First of all, you need a Lotus Notes client installed on the machine that will run the script. This Lotus Notes client must be installed in single-user mode (not multi-user!) for the examples below to work. A single-user install means that the Notes data folder is stored as a subfolder of the Notes installation directory, not in the user’s profile. Next, I highly recommend a rich editor. I personally prefer Sapien PowerShell Studio, but others like the integrated ISE or PowerGUI work as well.

Before you can query anything, you must make sure that the PowerShell process is running in 32-bit mode, because the Notes COM object is 32-bit. There are several ways to do that, and I have seen each of them work or fail depending on the client operating system, .NET Framework version, local permissions, … So try out what works best for you.

Option 1:

$PowerShellRunSpace = [System.Runtime.InteropServices.Marshal]::SizeOf([System.IntPtr]) #pointer size: 4 bytes = 32-bit process, 8 bytes = 64-bit process

if ($PowerShellRunSpace -ne 4) {exit} #because you have a 64-bit PowerShell

Option 2:

if ([Environment]::Is64BitProcess) {exit} #because you have a 64-bit PowerShell
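
Another practical approach, instead of exiting, is to start the script in the 32-bit PowerShell host directly. A sketch, assuming a default Windows installation and a hypothetical script name:

#The 32-bit PowerShell executable lives in SysWOW64 on 64-bit Windows
& "$env:SystemRoot\SysWOW64\WindowsPowerShell\v1.0\powershell.exe" -File .\MyNotesScript.ps1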

Now that this is done, we can start connecting to Notes. To do that, you have to understand how Notes “thinks” and works in the background; that is also how the client behaves. To be able to work with Notes you need a Notes session, which is opened using your Notes ID and the corresponding password. Once the session is open, you can do anything your ID is allowed to do, that means open Notes databases, create Notes databases, read and modify documents, … And how does all this work? With some kind of inline LotusScript! Yes! That means if you can develop LotusScript and PowerShell, then you can do anything with PowerShell that you can do with LotusScript. So let’s look at how the connection to the Domino Directory works:

$NotesSession = New-Object -ComObject Lotus.NotesSession #open the Notes session

$NotesSession.Initialize("mySecretNotesPassword") #provide the password for the ID file

#$NotesSession.Initialize() #if no password is provided, Notes will ask for the password

$NotesAddressbook = $NotesSession.GetDatabase("Domino01/NotesOrg", "names.nsf", 0)

As you can see, the script code above is essentially LotusScript that opens the names.nsf file from the Domino server “Domino01/NotesOrg”. Now you can continue to open views and documents, edit them, save your changes and do whatever you want.
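
For example, here is a small sketch that walks through a view of the Domino Directory and prints one field per document. The view ($Users) and the item FullName are the standard ones in names.nsf, but may differ in customized templates:

$UsersView = $NotesAddressbook.GetView('($Users)') #open the hidden users view
$Doc = $UsersView.GetFirstDocument()
while ($Doc -ne $null) {
    $Doc.GetItemValue('FullName')[0] #GetItemValue always returns an array
    $Doc = $UsersView.GetNextDocument($Doc)
}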

The COM object is really cool, but it has one big disadvantage: it is very slow. So I highly recommend using functions like GetDocumentByKey() and GetAllDocumentsByKey() as often as possible. This will drastically increase the performance of your script.
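
A rough sketch of such a keyed lookup, including a write operation; the view, the key and the item name are just examples and depend on your directory design:

$UsersView = $NotesAddressbook.GetView('($Users)')
$Doc = $UsersView.GetDocumentByKey('John Doe/NotesOrg', $true) #exact match on the sorted key column
if ($Doc -ne $null) {
    $Doc.ReplaceItemValue('OfficePhoneNumber', '+49 123 456789') | Out-Null
    $Doc.Save($true, $false) | Out-Null #force save, no response document
}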

Happy Notes-PowerShell-Scripting!