Connecting to a SAP WebService with Powershell

Now, the title says SAP, but this works for any web service; only the data types returned will be different. I was surprised how easy and powerful this is. I was recently working on a SAP project, so I knocked together a quick PowerShell POC script to troubleshoot some problems I was having. You could use a script like this in a MIMWAL workflow, or in a PowerShell Management Agent if coding an extensible MA gives you shudders and you need to connect to a SAP system via web services.

Before you begin you need your WSDL URL or file, plus an idea of the data types and functions/methods sitting behind it. The example described here used a couple of custom functional BAPIs created for the project, so this will be different in your particular environment.

This is a very simple approach that shows how easy it is once some custom function BAPIs have been created. It would be even easier if there was one function that returned all the user details in one go, and I would always insist on this if I was doing a project for a customer. However, that inevitably requires the external resource of a SAP consultant, so it can become problematic. The default SAP BAPIs are horrendously complicated, with nested data tables and multiple calls, so try to avoid them like the plague.
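Since your BAPIs will differ, here is only a rough sketch of the general call pattern using PowerShell's built-in `New-WebServiceProxy`; the WSDL URL and the wrapper method name `Z_GET_USER_DETAILS` are hypothetical placeholders, not from a real system:

```powershell
# Sketch only: the WSDL URL and BAPI wrapper name below are placeholders -
# substitute the ones exposed by your own SAP environment.
$wsdl  = "http://sapserver:8000/sap/bc/srt/wsdl/yourservice?wsdl"
$creds = Get-Credential

# New-WebServiceProxy generates a .NET proxy type from the WSDL,
# including classes for the SAP structures/tables the service returns
$proxy = New-WebServiceProxy -Uri $wsdl -Credential $creds -Namespace SapWs

# Inspect what methods and data types the proxy exposes before calling anything
$proxy | Get-Member -MemberType Method

# Call a (hypothetical) custom function-module wrapper and inspect the result
$result = $proxy.Z_GET_USER_DETAILS("JBLOGGS")
$result | Format-List *
```

Piping the proxy through `Get-Member` first is the quickest way to discover the data types mentioned above without digging through the raw WSDL.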

The Microsoft documentation on this subject is pretty good for a change and can be found here.

Errors running Azure PowerShell cmdlets in MIMWAL workflow

 

Just a quick one, as this has been discussed on the net before. If you are trying to run Azure PowerShell cmdlets in a MIMWAL workflow, or embedded in a custom workflow, and get the below error:

Import-Module : Could not load file or assembly ‘file:///C:\Windows\system32\WindowsPowerShell\v1.0\Modules\MSOnline\Microsoft.Online.Administration.Automation.PSModule.dll’ or one of its dependencies. This assembly is built by a runtime newer than the currently loaded runtime and cannot be loaded.

This is because the MIMWAL is built on .NET Framework 3.5, whereas the later Azure PowerShell cmdlets are built on .NET 4. If you can live with the older functionality, the easy fix is to install version 1 of the cmdlets: 64-bit from here, or here for the 32-bit version. These are build 8362.1, so getting a bit old, but they have most of the required functionality. The whole list of versions is here, but anything later than 8362.1 will not work. The other option is to take your script out of the workflow and trigger it externally.
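If you want to confirm which runtime your host process has actually loaded before chasing module versions, a quick check from inside the PowerShell activity (or any PowerShell session) is:

```powershell
# Report the CLR version the current process is running on.
# A host on CLR 2.0 (.NET 3.5) cannot load assemblies built for CLR 4.0,
# which is exactly the "built by a runtime newer" error above.
[System.Environment]::Version

# In Windows PowerShell, $PSVersionTable exposes the same information
$PSVersionTable.CLRVersion
```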

MIM HTTPS redirect and Site Pages redirect

Everyone always needs to do this, right? I used to do this with a SharePoint homepage, but using the URL Rewrite module is a much neater and more portable solution. URL Rewrite is an add-on module that you can get from the Web Platform Installer here: install that, search for URL Rewrite, and install the module.
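Once installed, you can sanity-check that IIS actually picked the module up. A quick check using the WebAdministration module (the module registers itself globally; the exact display name can vary by version, hence the wildcard):

```powershell
Import-Module WebAdministration

# URL Rewrite registers as a global IIS module; if this returns nothing,
# the install didn't take and the rewrite rules below will be ignored.
Get-WebGlobalModule | Where-Object { $_.Name -like "*Rewrite*" }
```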

Then use the below PowerShell code to add an HTTPS redirect and a redirect to your MIM site. The script will query all the sites and then ask you to enter an index corresponding to the site…..Macho Ninja!


$webSites = Get-Website
$global:index=-1
# Increment before output so the displayed index matches the array index
$webSites |  Format-Table -Property @{name="index";expression={$global:index+=1;$global:index}},name
$sitenameindex = [int](Read-Host -Prompt "Enter Site index")
$sitename = $webSites[$sitenameindex].name

try
{
$RuleName = "HTTPS Redirect"
$Rule = @{
 Name = $RuleName
 patternSyntax = 'ECMAScript'
 stopProcessing = 'True'
 match = @{
  url = '(.*)'
  ignoreCase = 'True'
  negate = 'False'
 }
 conditions = @{
  logicalGrouping = 'MatchAll'
  trackAllCaptures = 'True'
 }
 action = @{
  type = 'Redirect'
  url = 'https://{HTTP_HOST}/{R:1}'
  appendQueryString = 'False'
  redirectType = 'Permanent'
 }
}
Add-WebConfigurationProperty -PSPath "IIS:\Sites\$SiteName" -Filter "/system.webServer/rewrite/rules" -Name "." -Value $Rule 
$match = @{
 input = '{HTTPS}'
 matchType = 'Pattern'
 pattern = 'off'
 ignoreCase = 'True'
 negate = 'False'
}
Add-WebConfigurationProperty -PSPath "IIS:\Sites\$SiteName" -Filter "/system.webServer/rewrite/rules/rule[@Name='$RuleName']/conditions" -Name "." -Value $match


$RuleName = "Redirect to MIM Site"
$Rule = @{
 Name = $RuleName
 patternSyntax = 'ECMAScript'
 stopProcessing = 'True'
 match = @{
  url = '^$'
  ignoreCase = 'True'
  negate = 'False'
 }
 action = @{
  type = 'Redirect'
  url = '/IdentityManagement/default.aspx'
  appendQueryString = 'False'
  redirectType = 'Permanent'
 }
}
Add-WebConfigurationProperty -PSPath "IIS:\Sites\$SiteName" -Filter "/system.webServer/rewrite/rules" -Name "." -Value $Rule
}
catch
{
Write-Host "There was a problem............." -ForegroundColor Red
Write-Host $_.Exception.Message -ForegroundColor Red
exit
}
Write-Host "$sitename has been updated successfully...........Enjoy!" -ForegroundColor Green


The reference for the rewrite module is here. Be careful when using Permanent as your redirectType, though: if you are testing this out it may be worth using Temporary, because with Permanent the rule sticks even when you restart or hard-refresh the browser. You either have to clear your cache or start in private browsing for a new change to appear, which took me a while to work out when my perfectly corrected rule was still behaving like an old incorrect one while debugging.

FIMDelta automation

FIMDelta is a great tool for comparing two configurations and showing the differences so you can pick and choose what to migrate. However, extracting the configurations and putting them into the correct folders so that FIMDelta can make a comparison is cumbersome. So, in true ninja fashion, I have automated it! The following script extracts the configurations, creates folders and places the configs appropriately. It then downloads a copy of FIMDelta and places it in each folder; all you have to do is run FIMDelta!


$schemaFiles = @{}
$schemaFiles.Add("http://dev.mim.ninja:5725/ResourceManagementService","dev.xml")
$schemaFiles.Add("http://prod.mim.ninja:5725/ResourceManagementService","prod.xml")
$folder = (Get-Item -Path ".\" -Verbose).FullName
New-Item "$folder\Schema" -ItemType directory
New-Item "$folder\Policy" -ItemType directory
$creds = Get-Credential
if(@(get-pssnapin | where-object {$_.Name -eq "FIMAutomation"} ).count -eq 0) {add-pssnapin FIMAutomation}

foreach($schemaFile in $schemaFiles.GetEnumerator())
{
$schema_filename = "$folder\Schema\$($schemaFile.Value)"
Write-Host "Exporting schema configuration from $($schemaFile.Name)."
# Please note that SynchronizationFilter Resources inform the FIM MA.
$schema = Export-FIMConfig -schemaConfig -customConfig "/SynchronizationFilter" -Uri $schemaFile.Name -Credential $creds
if ($schema -eq $null)
{
    Write-Host "Export did not successfully retrieve configuration from FIM.  Please review any error messages and ensure that the arguments to Export-FIMConfig are correct."
}
else
{
    Write-Host "Exported " $schema.Count " objects."
    $schema | ConvertFrom-FIMResource -file $schema_filename
    Write-Host "Schema file is saved as " $schema_filename "."
}
}
foreach($schemaFile in $schemaFiles.GetEnumerator())
{
$policy_filename = "$folder\Policy\$($schemaFile.Value)"
Write-Host "Exporting policy configuration from $($schemaFile.Name)."
# In many production environments, some Set resources are larger than the default message size of 10 MB.
$policy = Export-FIMConfig -policyConfig -portalConfig -MessageSize 9999999 -Uri $schemaFile.Name -Credential $creds
if ($policy -eq $null)
{
    Write-Host "Export did not successfully retrieve configuration from FIM.  Please review any error messages and ensure that the arguments to Export-FIMConfig are correct."
}
else
{
    Write-Host "Exported $($policy.Count) objects."
    $policy | ConvertFrom-FIMResource -file $policy_filename
    Write-Host "Policy file is saved as " $policy_filename "."
}
}


#######SYNC SCHEMA###########

$pilot_filename = "$folder\Schema\dev.xml"
$production_filename = "$folder\Schema\prod.xml"
$changes_filename = "$folder\Schema\changes.xml"
$joinrules = @{
    # === Schema configuration ===
    # This is based on the system names of attributes and objects
    # Notice that BindingDescription is joined using its reference attributes.
    ObjectTypeDescription = "Name";
    AttributeTypeDescription = "Name";
    BindingDescription = "BoundObjectType BoundAttributeType";
}

Write-Host "Loading production file " $production_filename "."
$production = ConvertTo-FIMResource -file $production_filename
if($production -eq $null)
{
    throw (new-object NullReferenceException -ArgumentList "Production Schema is null.  Check that the production file has data.")
}

Write-Host "Loaded file " $production_filename "." $production.Count " objects loaded."

Write-Host "Loading pilot file " $pilot_filename "."
$pilot = ConvertTo-FIMResource -file $pilot_filename
if($pilot -eq $null)
{
    throw (new-object NullReferenceException -ArgumentList "Pilot Schema is null.  Check that the pilot file has data.")
}

Write-Host "Loaded file " $pilot_filename "." $pilot.Count " objects loaded."
Write-Host
Write-Host "Executing join between pilot and production."
Write-Host 
$matches = Join-FIMConfig -source $pilot -target $production -join $joinrules -defaultJoin DisplayName
if($matches -eq $null)
{
    throw (new-object NullReferenceException -ArgumentList "Matches is null.  Check that the join succeeded and join criteria is correct for your environment.")
}
Write-Host "Executing compare between matched objects in pilot and production."
$changes = $matches | Compare-FIMConfig
if($changes -eq $null)
{
    throw (new-object NullReferenceException -ArgumentList "Changes is null.  Check that no errors occurred while generating changes.")
}
Write-Host "Identified " $changes.Count " changes to apply to production."
Write-Host "Saving changes to " $changes_filename "."
$changes | ConvertFrom-FIMResource -file $changes_filename
Write-Host
Write-Host "Sync Schema complete.."

#################SYNC POLICY#######################

$pilot_filename = "$folder\Policy\dev.xml"
$production_filename = "$folder\Policy\prod.xml"
$changes_filename = "$folder\Policy\changes.xml"
$joinrules = @{
    # === Customer-dependent join rules ===
    # Person and Group objects are not configuration and will not be migrated.
    # However, some configuration objects like Sets may refer to these objects.
    # For this reason, we need to know how to join Person objects between
    # systems so that configuration objects have the same semantic meaning.
    Person = "MailNickname DisplayName";
    Group = "DisplayName";
    
    # === Policy configuration ===
    # Sets, MPRs, Workflow Definitions, and so on are best identified by DisplayName.
    # DisplayName is set as the default join criteria and applied to all object
    # types not listed here.
    
    # === Schema configuration ===
    # This is based on the system names of attributes and objects
    # Notice that BindingDescription is joined using its reference attributes.
    ObjectTypeDescription = "Name";
    AttributeTypeDescription = "Name";
    BindingDescription = "BoundObjectType BoundAttributeType";
    
    # === Portal configuration ===
    ConstantSpecifier = "BoundObjectType BoundAttributeType ConstantValueKey";
    SearchScopeConfiguration = "DisplayName SearchScopeResultObjectType Order";
    ObjectVisualizationConfiguration = "DisplayName AppliesToCreate AppliesToEdit AppliesToView"
}

Write-Host "Loading production file " $production_filename "."
$production = ConvertTo-FIMResource -file $production_filename
if($production -eq $null)
{
    throw (new-object NullReferenceException -ArgumentList "Production Policy is null.  Check that the production file has data.")
}

Write-Host "Loaded file " $production_filename "." $production.Count " objects loaded."

Write-Host "Loading pilot file " $pilot_filename "."
$pilot = ConvertTo-FIMResource -file $pilot_filename
if($pilot -eq $null)
{
    throw (new-object NullReferenceException -ArgumentList "Pilot Policy is null.  Check that the pilot file has data.")
}

Write-Host "Loaded file " $pilot_filename "." $pilot.Count " objects loaded."
Write-Host
Write-Host "Executing join between pilot and production."
Write-Host 
$matches = Join-FIMConfig -source $pilot -target $production -join $joinrules -defaultJoin DisplayName
if($matches -eq $null)
{
    throw (new-object NullReferenceException -ArgumentList "Matches is null.  Check that the join succeeded and join criteria is correct for your environment.")
}
Write-Host "Executing compare between matched objects in pilot and production."
$changes = $matches | Compare-FIMConfig
if($changes -eq $null)
{
    throw (new-object NullReferenceException -ArgumentList "Changes is null.  Check that no errors occurred while generating changes.")
}
Write-Host "Identified " $changes.Count " changes to apply to production."
Write-Host "Saving changes to " $changes_filename "."
$changes | ConvertFrom-FIMResource -file $changes_filename
Write-Host
Write-Host "Policy Sync complete. ....."
Write-Host "Sync and Policy export and comparison complete....Will download FIMDELTA if required"
##Download FIMDELTA and copy to both folders
$url = "https://github.com/pieceofsummer/FIMDelta/raw/master/build/FimDelta.exe"
if(-not(Test-Path "$folder\Schema\FimDelta.exe"))
{
Write-Host "Starting download of FIMDELTA......"
Start-BitsTransfer -Source $url -Destination "$folder\Schema\FimDelta.exe"
Copy-Item -Path "$folder\Schema\FimDelta.exe" -Destination "$folder\Policy\FimDelta.exe"
Write-Host "Download successful......Script End"
}
else
{
Write-Host "FIMDELTA already present, aborting........"
}

The astute among you will notice it's mostly just a butchering of the existing sync scripts with a little bit of ninjaness added.

Installing MIM on SharePoint 2016

The online docs don't exactly get it right when it comes to installing MIM on SharePoint 2016: they set the compatibility level for SharePoint 2010, which was the level FIM supported. However, that level isn't available in SharePoint 2016, and moreover it's not needed, as MIM supports SharePoint 2016 out of the box. I also like to specify my databases on the command line so I can make them all uniform. Here is the config I use.

So, run the normal SharePoint 2016 install, but don't run the config wizard afterwards; instead run these commands:

psconfig.exe -cmd configdb -create -server fim-sql -database SharePoint_central_config -user domain\mimsp -password mypassword -passphrase mypassphrase -admincontentdatabase SharePoint_admin_content -localserverrole SingleServerFarm

Specifying the database and the server role makes things a little tidier. Also, always use a SQL alias: this is a ninja technique and will make life so much easier in a DR or server-move situation.
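If you prefer to script the alias rather than click through cliconfg.exe, here is a sketch under the assumption that the alias is the `fim-sql` name used in the psconfig command above; the real server name and port are placeholders:

```powershell
# Create a SQL client alias "fim-sql" pointing at the real server over TCP.
# The 64-bit and 32-bit client stacks each read their own registry key,
# so set both. "DBMSSOCN,server,port" is the TCP/IP alias format.
$target = "DBMSSOCN,realsqlserver.mimninja.com,1433"
$keys = "HKLM:\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo",
        "HKLM:\SOFTWARE\Wow6432Node\Microsoft\MSSQLServer\Client\ConnectTo"
foreach ($key in $keys)
{
    if (-not (Test-Path $key)) { New-Item -Path $key -Force | Out-Null }
    New-ItemProperty -Path $key -Name "fim-sql" -Value $target -PropertyType String -Force | Out-Null
}
```

If you ever move the databases, you only have to repoint the alias rather than reconfigure SharePoint and MIM.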

Now we create the MIM portal site:

New-SPManagedAccount

You may not need the above command if you are using the same account as specified in the initial psconfig command, as that account is already added as a farm admin.

The online install docs assume you are installing on SharePoint 2013, so they don't mention that you have to set the compatibility level to 15; anything less than 15 won't work.



$dbManagedAccount = Get-SPManagedAccount -Identity domain\mimsp
New-SpWebApplication -Name "MIM Portal" -ApplicationPool "MIMAppPool" -ApplicationPoolAccount $dbManagedAccount -AuthenticationMethod "Kerberos" -Port 80 -URL http://mim.mimninja.com -DatabaseName SharePoint_WSS_Content -DatabaseServer fim-sql
$t = Get-SPWebTemplate -compatibilityLevel 15 -Identity "STS#1"
$w = Get-SPWebApplication http://mim.mimninja.com
New-SPSite -Url $w.Url -Template $t -OwnerAlias domain\fimsp -CompatibilityLevel 15 -Name "MIM Portal" -SecondaryOwnerAlias domain\fimsync
$s = Get-SPSite $w.Url
 
These commands are NOT needed as we are using the correct Compatibility Level.

$s.AllowSelfServiceUpgrade = $false
$s.CompatibilityLevel

Then continue as normal


$contentService = [Microsoft.SharePoint.Administration.SPWebService]::ContentService;
$contentService.ViewStateOnServer = $false;
$contentService.Update();

That's it for creating the sites; there are a few more gotchas that I will put up soon…..

Logging for the FIM/MIM Web Services connector and config tool

There is so much information out there about setting up logging, and I couldn't get any of it to work. So, to that end, here is the definitive list of how to get logging working. Please note the difference between logging for the Web Service Configuration tool and for the Web Services connector itself….

To enable ETW logging for the connector, follow the steps below:

Case 1: When “Run this Management agent in a separate process” checkbox is checked.

Add the below section after the </configSections> tag in the dllhost.exe.config file.
File path: C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\dllhost.exe.config


<system.diagnostics>
    <sources>
        <source name="ConnectorsLog" switchValue="Verbose">
            <listeners>
                <add initializeData="ConnectorsLog" type="System.Diagnostics.EventLogTraceListener, System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" name="ConnectorsLogListener" traceOutputOptions="LogicalOperationStack, DateTime, Timestamp, Callstack" />
            </listeners>
        </source>
    </sources>
</system.diagnostics>



Case 2: When “Run this Management agent in a separate process” checkbox is not checked.

Add the below section inside the <system.diagnostics>/<sources> section of the miiserver.exe.config file.
File path: C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\Bin\miiserver.exe.config

Note: there are two <system.diagnostics> sections in the miiserver.exe.config file. Please make sure to add the below section under the <system.diagnostics> section that appears first.



<source name="ConnectorsLog" switchValue="Verbose">
    <listeners>
        <add initializeData="ConnectorsLog" type="System.Diagnostics.EventLogTraceListener, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" name="ConnectorsLogListener" traceOutputOptions="LogicalOperationStack, DateTime, Timestamp, Callstack" />
    </listeners>
</source>


To enable ETW logging for the WS Config tool, follow the steps below (new method):

The log level is resolved from the tool's config file, WSConfigTool.exe.config, which is located under C:\Program Files\Microsoft Forefront Identity Manager\2010\Synchronization Service\UIShell\Web Service Configuration.

Initially, after installing the tool, the below section is commented out, so you need to uncomment it as shown here:

 



<!--Uncomment system.diagnostics section to enable the event viewer logging for the WS Config tool, other listeners can also be added like TextWriterTraceListener, XmlWriterTraceListener etc.-->
<system.diagnostics>
  <sources>
    <source name="WSConfigToolLog" switchValue="Verbose">
      <listeners>
        <add initializeData="WSConfigToolLog" type="System.Diagnostics.EventLogTraceListener, System, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" name="WSConfigToolLogListener" traceOutputOptions="LogicalOperationStack, DateTime, Timestamp, Callstack, ProcessId, ThreadId" />
      </listeners>
    </source>
  </sources>
</system.diagnostics>

 

If you don't see the section above in the WSConfigTool.exe.config file, it means you are working with an older version of the WS connector, so in this case please follow the steps below.

Web Service Configuration Tool Logging (old method)

By default, Web Service Configuration Tool logging is disabled. To turn logging on, perform the following steps:

1. Open the file FIM_INSTALL_DIR\Synchronization Service\UIShell\Web Service Configuration\WSConfigTool.exe.config

2. Go to the "LoggingLevel" setting and change the value to 2 or 3:

<setting name="LoggingLevel" serializeAs="String">
<value>0</value>
</setting>

3. The different logging values represent the following:

a. Value 2 – High logging: high-importance events (e.g. exceptions) are logged.

b. Value 3 – Verbose logging: all the activities performed are logged.

c. Any other value represents logging disabled.

4. Save the changes.

Log file is written to folder:  C:\ProgramData\WebServiceConfigTool

Log file name: WebServiceConfigTool.log

After you have enabled the connector logging, follow these steps:

  1. Clear the Application log.
  2. Create a new connector.
  3. Copy all the logs to a separate file.
  4. Clear the logs.
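Steps 1, 3 and 4 above can be scripted with the built-in event log tooling. A sketch, assuming the backup paths are placeholders and that the event source is the ConnectorsLog name configured earlier:

```powershell
# Export the Application log to a file before clearing it (path is a placeholder)
wevtutil epl Application "C:\Temp\Application-backup.evtx"

# Clear the Application log so only new connector activity is captured
Clear-EventLog -LogName Application

# After reproducing the issue, pull just the ConnectorsLog entries out to a file
Get-EventLog -LogName Application -Source ConnectorsLog |
    Export-Csv "C:\Temp\ConnectorsLog.csv" -NoTypeInformation
```

Both wevtutil and Clear-EventLog need an elevated prompt.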

 

If you get the error "The configuration section cannot contain a CDATA or text element", try removing all the spaces in the pasted XML and re-inserting them. White space from the web turns out to be not so white sometimes……
