S4B Online PowerShell – Modern Auth

Skype for Business Online is another Microsoft API I've come across that needs a solution for Modern Auth.

This one's a bit different. You don't need to register an AD app; instead you connect to S4B with a token, and you can store that token as a byte array (in a file or database).

This method uses Microsoft.IdentityModel.Clients.ActiveDirectory.dll to acquire a token.

I wrote the function below to handle the token work, plus snippets showing how to create the token to store (as a byte array) and how to recall it for unattended authentication.

Thanks again to Elliot Munro from GCITS for helping me figure this one out, using a method we found for connecting to the Microsoft Intune Graph API.

function Get-MSAuthToken 
{
    [cmdletbinding()]

    param
    (
        [Parameter(Mandatory=$true)]
        $User,

        [Parameter(Mandatory=$true)]
        $TenantId,

        [Parameter(Mandatory=$true)]
        $ClientId,

        [Parameter(Mandatory=$true)]
        $RedirectUri,

        [Parameter(Mandatory=$true)]
        $ResourceAppIdURI,

        [Parameter(Mandatory=$true)]
        $Authority,

        [Parameter(Mandatory=$false)]
        $StoredTokenByteArray,

        [Parameter(Mandatory=$false)]
        $ReturnTokenByteArray
    )
      
    Write-Host "Looking for the AzureAD module..."
    $AadModule = Get-Module -Name "AzureAD" -ListAvailable
    if ($null -eq $AadModule) 
    {
        Write-Host "AzureAD module not found, looking for AzureADPreview..."
        $AadModule = Get-Module -Name "AzureADPreview" -ListAvailable
    }

    if ($null -eq $AadModule) 
    {
        throw "AzureAD PowerShell module not installed..." 
    }

    # Get the path to the ActiveDirectory assemblies
    # If more than one version of the module is installed, use the latest
    if ($AadModule.count -gt 1)
    {
        $Latest_Version = ($AadModule | Select-Object Version | Sort-Object Version)[-1]
        $AadModule = $AadModule | Where-Object { $_.Version -eq $Latest_Version.Version }

        # If multiple copies of the same version were found, take a single one
        if ($AadModule.count -gt 1)
        {
            $AadModule = $AadModule | Select-Object -Unique
        }
    }

    $adal      = Join-Path $AadModule.ModuleBase "Microsoft.IdentityModel.Clients.ActiveDirectory.dll"
    $adalforms = Join-Path $AadModule.ModuleBase "Microsoft.IdentityModel.Clients.ActiveDirectory.Platform.dll"

    [System.Reflection.Assembly]::LoadFrom($adal) | Out-Null
    [System.Reflection.Assembly]::LoadFrom($adalforms) | Out-Null

    try 
    {
        $authContext = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext" -ArgumentList $authority

        # https://msdn.microsoft.com/en-us/library/azure/microsoft.identitymodel.clients.activedirectory.promptbehavior.aspx
        # Change the prompt behaviour to force credentials each time: Auto, Always, Never, RefreshSession

        $platformParameters = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.PlatformParameters" -ArgumentList "Auto"

        $userId = New-Object "Microsoft.IdentityModel.Clients.ActiveDirectory.UserIdentifier" -ArgumentList ($User, "OptionalDisplayableId")

        if ($storedTokenByteArray -ne $null)
        {
            $authContext.TokenCache.Deserialize($storedTokenByteArray)
        }

        $authResult = $authContext.AcquireTokenAsync($resourceAppIdURI,$clientId,$redirectUri,$platformParameters,$userId).Result

        if($authResult.AccessToken)
        {
            if ($ReturnTokenByteArray)
            {
                $blobAuth = $authContext.TokenCache.Serialize();
                return $blobAuth
            }
            else
            {
                return $authResult
            }
        }
        else 
        {
            Write-Host
            Write-Host "Authorization Access Token is null, please re-run authentication..." -ForegroundColor Red
            Write-Host
            break
        }
    }
    catch 
    {
        write-host $_.Exception.Message -f Red
        write-host $_.Exception.ItemName -f Red
        write-host
        break
    }
} 

And the below snippet authenticates once (prompting you) and then writes the stored token byte array out to a file:

$clientId = "7716031e-6f8b-45a4-b82b-922b1af0fbb4" #S4B
$redirectUri = "https://adminau1.online.lync.com/OcsPowershellOAuth"
$resourceAppIdURI = "https://adminau1.online.lync.com/OcsPowershellOAuth"
$authority = "https://login.microsoftonline.com/common"
$tenantId = "<your_tenant_id>"
$user = "user@yourtenant.onmicrosoft.com"

#get S4B auth token in to Byte Array
$byteArrayToken = Get-MSAuthToken -User $user -TenantId $tenantId -ClientId $clientId -RedirectUri $redirectUri -ResourceAppIdURI $resourceAppIdURI -Authority $authority -ReturnTokenByteArray $true

#store byte array
$byteArrayToken | Out-File C:\bytes.txt

And this snippet reads the byte array back in, deserializes it, and then uses it to authenticate:

$byteArrayToken = Get-Content C:\bytes.txt
$user = "myuser@tenant.onmicrosoft.com"
$tenantId = "<your_tenant_id>"

$clientId = "7716031e-6f8b-45a4-b82b-922b1af0fbb4" #S4B
$redirectUri = "https://adminau1.online.lync.com/OcsPowershellOAuth"
$resourceAppIdURI = "https://adminau1.online.lync.com/OcsPowershellOAuth"
$authority = "https://login.microsoftonline.com/common"

$s4bAuth = Get-MSAuthToken -User $user -TenantId $tenantId -ClientId $clientId -RedirectUri $redirectUri -ResourceAppIdURI $resourceAppIdURI -Authority $authority -StoredTokenByteArray $byteArrayToken

$secureToken = ConvertTo-SecureString $s4bAuth.AccessToken -AsPlainText -Force
New-CsOnlineSession -OAuthAccessToken $secureToken 

There may be another step in this – getting the right Lync Online admin URL for your S4B tenant. This can be obtained from the LyncDiscover process, and I believe can be retrieved by doing an HTTP GET on the LyncDiscover URL for your tenant.

ie: http://yourtenant.lyncdiscover.onmicrosoft.com.

I haven’t got to this LyncDiscover bit yet.. may need to create a function that does this to get the correct admin URL…

NinJam – Online Jamming Plugin for Reaper

At the beginning of Corona lockdown (here in NZ – 23rd March 2020), a mate found the NinJam plugin for Reaper. We all used Reaper anyway, so would be great if it worked…

Looked into it, sussed out that there's a little server app you run on your PC; open a port on your firewall (2049) to the machine that runs it, and ta-da… the rest is history.

Our musical clan (Satellitas) racked up 18.5 hours of jamming recordings over the Lockdown period. NinJam is the business.

Check out these videos to get going:

Running a NinJAM Server – Note: only one person in your group has to do this; the others all just connect to you on your IP address on port 2049.

 

This video covers setting up Reaper ready to jam.

Microsoft Secure App Authentication

The world of Microsoft authentication is all changing, with stricter AD policies like Multi-Factor Authentication being rolled out. That's a good thing.

For people that do all their Microsoft services administration through the UI, that's OK. The Microsoft Online sign-in process is (after many years of being average) finally pretty solid, and can handle multiple identities logged in at the same time.

But in the PowerShell world, it's all changed in terms of how you connect and run processes and scripts unattended. You can't save passwords, or be ready to accept an MFA prompt, if you want a process to run periodically or on demand through a provisioning system.

Microsoft’s Partner Center API is also the modern way you are given delegated access to modify settings and services for your CSP Customers.

The process these days appears to be to register an Azure AD Application in to the CSP tenant, and authenticate against this application for delegated access to your sub customers.

Simple commands like Get-MsolUser can be run in this context, but you specify -TenantId 'your O365 tenant id' to work in the context you need to.

These clever guys have some great blogs about how to register a Partner Center API Secure Authentication application and authenticate with a token so you can make a connection to Office 365, Azure AD, Exchange etc.

https://gcits.com/knowledge-base/how-to-connect-to-delegated-office-365-tenants-using-the-secure-app-model/

https://www.cyberdrain.com/using-the-secure-app-model-to-connect-to-microsoft-partner-resources/

https://www.cyberdrain.com/using-the-secure-application-model-with-partnercenter-2-0-for-office365/

https://www.cyberdrain.com/connect-to-exchange-online-automated-when-mfa-is-enabled-using-the-secureapp-model/

These methods work if you are a Microsoft Partner and have a Partner Center where you manage your customers.

But, if you want to run your PowerShell directly against an Office 365 tenant that you don’t have delegated access to, that’s a different process.

I looked for ages trying to find a simple example that would let me register a Secure App in the customer's tenant I want to connect to and manage, and then use this app to authenticate from PowerShell with no MFA, passwords etc.

I ended up working with Elliot Munro from GCITS (from the first link above), and with his clever reverse engineering skills we figured out the below script. It's essentially an adaptation of the Partner Center Secure App script from the above examples, but it targets the end-customer's tenant instead.

This is the code to create the Secure App in the tenant.
Note: you still need the PartnerCenter module to do this, even though you won't be making a connection to Partner Center. This gives your scripted login the rights to do what it needs to do.

Make sure you have these modules installed:

Install-Module PartnerCenter
Install-Module MSOnline
Install-Module AzureAD

Code to create Secure App:

$adAppAccess = [Microsoft.Open.AzureAD.Model.RequiredResourceAccess]@{
    ResourceAppId = "00000002-0000-0000-c000-000000000000";
    ResourceAccess =
        [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
            Id = "5778995a-e1bf-45b8-affa-663a9f3f4d04";
            Type = "Role"},
        [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
            Id = "a42657d6-7f20-40e3-b6f0-cee03008a62a";
            Type = "Scope"},
        [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
            Id = "311a71cc-e848-46a1-bdf8-97ff7156d8e6";
            Type = "Scope"}
}

$graphAppAccess = [Microsoft.Open.AzureAD.Model.RequiredResourceAccess]@{
    ResourceAppId = "00000003-0000-0000-c000-000000000000";
    ResourceAccess =
        [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
            Id = "bf394140-e372-4bf9-a898-299cfc7564e5";
            Type = "Role"},
        [Microsoft.Open.AzureAD.Model.ResourceAccess]@{
            Id = "7ab1d382-f21e-4acd-a863-ba3e13f7da61";
            Type = "Role"}
}


$SessionInfo = Connect-AzureAD

$DisplayName = "Test Auth"

$app = New-AzureADApplication -AvailableToOtherTenants $true -DisplayName $DisplayName -IdentifierUris "https://$($SessionInfo.TenantDomain)/$((New-Guid).ToString())" -RequiredResourceAccess $adAppAccess, $graphAppAccess -ReplyUrls @("urn:ietf:wg:oauth:2.0:oob","https://localhost","http://localhost","http://localhost:8400")
$password = New-AzureADApplicationPasswordCredential -ObjectId $app.ObjectId
$spn = New-AzureADServicePrincipal -AppId $app.AppId -DisplayName $DisplayName

$PasswordToSecureString = $password.value | ConvertTo-SecureString -asPlainText -Force
$credential = New-Object System.Management.Automation.PSCredential($($app.AppId),$PasswordToSecureString)

$token = New-PartnerAccessToken -ApplicationId $app.AppId -Scopes 'Directory.AccessAsUser.All offline_access openid profile User.Read' -ServicePrincipal -Credential $credential -Tenant $spn.AppOwnerTenantID -UseAuthorizationCode

Write-Host "ApplicationId       = $($app.AppId)"
Write-Host "ApplicationSecret   = $($password.Value)"
Write-Host "TenantId            = $($SessionInfo.TenantId)"
Write-Host "Refresh Token       = $($token.RefreshToken)" 

When you run the above, a browser window will pop up and get you to authenticate with a user that has the rights to create the Secure App in that O365 tenant.

This will output the App Id, App Secret and Refresh Token in the PowerShell window. These are what you use to authenticate in your PowerShell scripts when you need to.

NOTE: With all of this, you’re going to be dealing with Security tokens that have some power. Make sure to save them appropriately in something like the Azure Key Vault.

The code you put in your PowerShell scripts to authenticate is the same as the example links above connecting to Partner Center.

$refreshToken = "<refresh_token>"
$app_id = "<app_id>"
$app_secret = "<app_secret>"
$tenantId = "<tenant_id>"

function Get-GCITSAccessTokenByResource($AppCredential, $TenantId, $Resource) {
    $authority = "https://login.microsoftonline.com/$TenantId"
    $tokenEndpointUri = "$authority/oauth2/token"
    $content = @{
        grant_type    = "refresh_token"
        client_id     = $AppCredential.appId
        client_secret = $AppCredential.secret
        resource      = $Resource
        refresh_token = $AppCredential.refreshToken
    }

    $response = Invoke-RestMethod -Uri $tokenEndpointUri -Body $content -Method Post -UseBasicParsing
    return $response.access_token
}

$AppCredential = @{
    appId        = $app_id
    secret       = $app_secret
    refreshToken = $refreshToken
}
  
    
try
{
    $MSGraphToken  = Get-GCITSAccessTokenByResource -Resource "https://graph.microsoft.com" -tenantid $tenantId -AppCredential $AppCredential
    $AadGraphToken = Get-GCITSAccessTokenByResource -Resource "https://graph.windows.net" -tenantid $tenantId -AppCredential $AppCredential
}
catch
{
    $errorMessage = $_.Exception.Message
    throw "Error refreshing tokens - $($errorMessage)" 
}

try
{
    #Connect Office 365
    Connect-MsolService -MsGraphAccessToken $MSGraphToken -AdGraphAccessToken $AadGraphToken 

    #Connect Azure AD
    Connect-AzureAD -MsAccessToken $MSGraphToken -AadAccessToken $AadGraphToken -AccountId $tenantId -TenantId $tenantId

    return $true
}
catch
{
    $errorMessage = $_.Exception.Message
    throw "Error connecting to MSOL - $($errorMessage)" 
}  

Kelvin Tegelaar, in his article on connecting to Exchange, also demonstrates how you can connect without Partner Center, using a 'well known' app to authenticate the modern way with Exchange.

Note in his article, the Non-Partner Center way to authenticate and give consent and then use this to connect to Exchange.

https://www.cyberdrain.com/automating-with-powershell-using-the-secure-application-model-updates/

Working next on how to connect to the Skype for Business Online module using this similar method. Will post an update when I figure it out..

Virtualizing your PC so you can rebuild it (but still have a running copy of old setup).

I have been in several situations where I really want to rebuild my laptop (because it's a complete mess), but don't want to lose the way it's set up.

I do a bit of development with Visual Studio, and invariably find I've had to download all sorts of SDKs and runtimes to get projects working. I just know that if I copy the code to another build of Windows and open and run it, there will be all sorts of dependencies that I have completely forgotten I had to install to get it working in the first place.

There's also a number of other reasons I may want to go back to my pre-rebuild laptop setup – if I can't get something working on the new build, or if I can't find something.

So, I decided to see if I could 'virtualize' (P2V) my current laptop setup and run it on Hyper-V if I need to get into it (on top of my newly built Win 10), running the VM from an external HDD. Yes, not the fastest, but it's there if I need to fire up things as they were on my previous messy Windows setup and grab what I need.

So I started looking for an up-to-date guide. I found this one on Veeam's site. It's really good, and it works.

Thanks to whoever wrote the process.

I also needed to get my data drive (250GB) over to the new image, so when I start the VM up it has its E: drive as it was before and things all work. I followed this guide to create the E drive VHDX file so I could attach it. It's also a good time to grab all my data and have another copy of it in this VHDX file.

https://www.windowscentral.com/how-create-and-set-vhdx-or-vhd-windows-10#create_vhdx_windows10

Here is the process I went through to do this successfully.

Downloaded Disk2VHD
https://docs.microsoft.com/en-us/sysinternals/downloads/disk2vhd

Extracted and ran it. Un-ticked the D drive (HP Recovery) and E drive (all my data) to make the image as small as I could.
Plugged in an external drive (F) and created the VHDX of my bootable laptop on there.
Used these settings:

The cool thing is you can keep using Windows while it creates the image. I definitely wouldn't be creating any new docs during the process, but I could still browse the web etc. The process took approx. 2 hours and created a 194GB VHDX file (the C drive was using around 200–230GB).

I then repeated the process and chose just my data drive, creating another VHDX file (E_Drive) – that took another hour or so for roughly 200GB.

On a completely different PC running Hyper-V, I plugged in the drive with my two VHDX files: my C drive (plus System Reserved partition etc.) and my E drive.

I created a new VM on that machine, browsed to and attached the C drive and E drive, gave the box 8GB RAM and 4 cores, and turned it on. Boom – there's my laptop running in a VM.

First time starting up, Windows took a while to rejig things, but eventually I was presented with the desktop to log in.

I had the luxury of testing it all worked on another machine before going through the process of blitzing my laptop and starting again with a clean Windows installation. I STRONGLY suggest you do the same and not just assume this process will work for you. Once you nuke your laptop, it's too late to go back.

I plan to have the VM set up in Hyper-V on the new Win 10 build so I can just plug in the HDD and power it on if I need it, but worst case I have it working on this other Hyper-V server.

PlatformIO, Visual Studio Code and Arduino – Bye Bye Arduino IDE

I was looking around at some code examples and stumbled onto someone's comments about PlatformIO as a replacement for the Arduino IDE. I really don't like the Arduino IDE, so I was keen to have a look. I tried Visual Micro (which allows Arduino development in Visual Studio) once, but found it a bit buggy and slow. It may be fine these days?

Anyway, I thought I'd put in the time and learn PlatformIO, and I am very glad I did! It took a little bit of messing around, but it was worth it. Never going back to the Arduino IDE now I'm set up 🙂

Here’s the process I went through to get going on Windows 10:

Installation

Download Visual Studio Code – it's only 43MB. I was expecting a Visual Studio style two-hour install, but Visual Studio Code is really lightweight.

Run Visual Studio Code as Admin (right click, Run as Administrator – needed for the next bit).

I changed the color scheme to light as I am old school 🙂   – File > Preferences > Color Scheme

Install the PlatformIO extension into Visual Studio Code – follow the instructions here (this is why we ran as Admin – it failed for me the first time without it).

 

All going well, you should be ready to go, and when you open Visual Studio Code you should see the PlatformIO home screen, with a little home icon at the bottom left of the status bar.

 

Create a Project

Click on + New Project

Give it a name

Select a board – you can search here. This is what I really liked: heaps of support for modern boards. I initially tried the WEMOS LOLIN board – an ESP32 with an OLED screen on it. I've also tried basic NodeMCU 1.0 boards too.

For this example, using a Wemos D1 Mini.

Set Arduino as the framework.

Press Finish

Once you press Finish, if this is your first project, it can take some time to open and install all the libraries and toolchains etc.

I thought mine had locked up, but after maybe 7–8 mins it kicked into life. Now every time I start a new project it's fast.

I had the same thing when I started a new ESP32 project – it has to go and get all the bits and bobs to make it go, so it's slow the first time.

Moral – let it do its thing.

 

Getting your head around it

Here are my notes from initially getting my head around it and how it maps to the Arduino IDE.

First off, the main 'sketch' is the src > main.cpp file.

We can get something really basic happening here, just writing to the serial terminal. Use this code:

#include <Arduino.h>

void setup() 
{    
    Serial.begin(115200);
}

void loop() 
{    
     Serial.println("Hello Universe");
     delay(1000);
}

Now before we upload our basic sketch, configure the board port and speed.

Open the platformio.ini file from the left hand nav at the bottom

If you need to find what port your board is connected on, check under Device Manager in Windows:

Start > type Device Manager

Expand out Ports (COM & LPT)

Once you have set your correct port in the INI file, you can build and upload.
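For reference, here's what a minimal platformio.ini for a Wemos D1 Mini might look like. Note the COM port here is just an assumption – use whatever Device Manager shows for your board; the monitor speed matches the Serial.begin(115200) in the sketch:

```ini
[env:d1_mini]
platform = espressif8266   ; ESP8266 platform/toolchain
board = d1_mini            ; Wemos D1 Mini board definition
framework = arduino
upload_port = COM3         ; assumption - check Device Manager for yours
monitor_speed = 115200     ; match Serial.begin() in main.cpp
```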

The bottom nav bar is where you find the build, upload and show terminal buttons, similar to Arduino IDE.

The tick = build, the right arrow = upload, and the plug icon = show terminal window.

Press the tick button to build.

You should now have a compiled version of our simple sketch. If you have any errors, the Terminal window will show what's wrong. Hopefully you'll have no errors with the basic sketch above.

Now upload your sketch to the board (presuming it's plugged in) using the right arrow on the bottom nav.

 

Once you are uploaded, you can then switch on the terminal monitor by pressing the plug icon in the bottom nav:

Here you can see our simple app displaying ‘Hello Universe’ in a loop every second:

Yay, that's the basics of PlatformIO… now for the next bit: handling libraries.

 

Library Management

This is the bit I like about PlatformIO; it took me a bit to get my head around.

First off, there's a library manager built in, like the later versions of the Arduino IDE.

Let’s say we want to add Blynk support.

Switch to the PIO Home tab, and click on the Libraries button on the left hand side.

Here you can search for a library.

Click in to the Library to read more, get examples, and Install.

Once the library is installed, it will be put into your file system under the .platformio\lib folder in your user profile. This is the folder where libraries are stored.

You should then be able to just add an #include reference for the library you added.

I found, though, that when you include manual libraries, sometimes they get a green squiggle under them.

In this case, you have to go add a reference to the libraries path.

When you see a green squiggle under the #include line, go check the c_cpp_properties.json file in your project.

Make sure the path to the library you want to use is in this file in the top section.

NOTE: the folder path is separated by forward slashes instead of back slashes.. you need to follow this format.

In the example below, I can see the Blynk library path was automatically added when we installed the library through the library manager search/install process.
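As a cut-down example, a c_cpp_properties.json with a manually added library path might look like this. The username and library folder here are placeholders – use the actual path under your .platformio\lib folder, and note the forward slashes:

```json
{
    "configurations": [
        {
            "name": "Win32",
            "includePath": [
                "${workspaceFolder}/src",
                "C:/Users/yourname/.platformio/lib/YourLibrary"
            ]
        }
    ]
}
```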

 

If I wanted to download a manual library from the internet and use it in my sketch, I would follow these steps:

For example, let's say I want to use this WifiManager library that has been modified to support the ESP32 and isn't necessarily available in the Library Manager.

I’d download the ZIP file from here: https://github.com/zhouhan0126/WIFIMANAGER-ESP32

I'd right click and unblock the ZIP, then extract it to here:

C:\Users\paul.obrien\.platformio\lib\WIFIMANAGER-ESP32-master

I'd then insert a reference line at the bottom of the includePath section of my c_cpp_properties.json file (changing the \'s to /'s) pointing at the path of the library.

Now I'd add my #include as normal to my sketch and build.

 

Hope that helps someone get started quickly.

The IntelliSense, Problems tab (so you can immediately see issues), and general workflow are so much nicer than the Arduino IDE.

Plastic Micky – My IoT Robot to entertain the toddler

My 2 1/2 year old is robot mad. I played him an old YouTube clip of Metal Micky from the early 1980s that was on TV when I was a kid, and he has been an addict ever since. There are some more YouTube vids of a guy in the USA retrofitting Omnibot (again from the '80s) with modern tech, and he watches them over and over.

So I thought, why not build him his own one. He's fully into the parts and 'how does it work' buzz, and fizzes when the AliExpress parcels turn up from China with servos and bits and pieces.

I wanted the thing to be a similar size to him, be able to move around, wave its arms, move its head left and right, and control various LED lights and strips. A project we can build up to do different things and tinker with on Sunday afternoons.

Key things to suss:

  • Body – what to use? (plastic rubbish bins :))
  • Head – what to use, and how to make it move left and right (servo, bracket etc.)
  • Movement – tank tracks on the bottom, with an ultrasonic sensor so it can avoid stuff – like a Roomba 🙂
  • Lights/buttons to play with
  • Camera/screen – got an old Android tablet with a forward-facing camera to stick on the front
  • Brain – ESP8266 Wi-Fi microcontroller connected to motors/servos/relays to control everything
  • Control – C++ code on the ESP8266 (Arduino style) and something like Blynk for mobile phone control
  • Semi-resemble Metal Micky – we call him 'Plastic Micky'.

 

So this post is going to be the process of building it up – physically sourcing all the bits and the electronics (as simply and cheaply as possible) to bring Micky to life.

I aim to build up the electronics using simple, cheap, readily available parts… and maybe when the design is 'stable', turn it into a single PCB – maybe others can contribute to the design? I'm going 'open source' on the electronics design and C++ code.

Here’s a link to GitHub project for the source code.. https://github.com/paulobriennz/plasticmicky

 

I'm just a beginner really with C++ and microcontrollers, but I'll have a crack – if anyone wants to make the code better, please feel free 🙂 I was even thinking this could be a project to build in schools for kids to learn electronics. There are tons of 'instructable' type articles around on the net about how to use an Arduino to make some wheels move, or lights flash, or a servo do things, but not really any one-stop guide that combines all the components to build a walking, talking robot with mobile control and a bit of intelligence to combine movement with 'personality'.

I'm also going to build an Alexa skill, so Micky can do things by voice command as well as mobile phone control. I'll post the code for this as well.

 

At a high level, I've come up with an electronics design based on trying a few different bits and pieces:

  • 12V 7Ah alarm battery for the power source (need to source a battery charging module)
  • NodeMCU/Wemos development board – ESP8266 based – easy to swap out and try different firmware etc.
  • L298N motor controller (2A motor rating) to drive the tracks for movement (initially tried an ESP12E motor controller board (600mA motor rating) but not enough guts to handle the robot's weight – the chip kept overheating)
  • PCA9685 16-channel I2C servo control board to move head, arms, ears, mouth etc.
  • PCF8574 8-channel I2C GPIO extender – as we're going to need a few pins to make this all happen – 6 alone for the L298N motor controller
  • Adjustable buck regulators to get 5V for relays, electronics, LCD etc., 6–7V for solenoids, and 12V for motors, LED strips etc.
  • OLED I2C screen to help 'communicate' what's going on
  • Tank tracks for movement – bought a kit with a small ESP12E motor shield and NodeMCU, but the chip on that motor driver board is pretty gutless and just overheated trying to drive any weight on the tracks – hence the L298N controller above. With the motors connected to the 12V battery directly it was grunty as, so the 12V motors and track mechanism have definitely got enough guts to move the thing
  • Pan and tilt servo bracket for the head – just using a servo to spin the head left and right (although this area of the design needs more (mechanical) work)

 

The plan is to be able to control it via Blynk (or some IoT control app) to move about, move the head, control lights etc., and have an autonomous mode (like a Roomba) that can move around and use its ultrasonic sensor (radar) to avoid objects and change course. I'll also expose a 'web service' so we can create an Alexa skill to get it to do things. Simple Wi-Fi setup so you don't have to mess around with C++ code to get it to connect – plug and play!

 

Procurement

First mission – what are the body and head going to be… after watching Metal Micky on YouTube for the 407th time with the toddler, I thought: rubbish bins, like you see in cafeterias.

I went to The Warehouse (a general everything store we have in NZ – like Home Depot) and got two rubbish bins.

One is a mini 'wheelie' bin and the other is a round bin with a push flap.

 

https://www.thewarehouse.co.nz/p/living-co-wheelie-bin-black-60l/R2120555.html#start=1

https://www.thewarehouse.co.nz/p/taurus-rad-bin-50l-assorted/R638533.html#start=1

I'll use the top of the round one for Micky's head, and turn the wheelie bin upside down, remove the lid and casters, and make a wooden base for the wheelie bin to sit on upside down.

This base can have the tracks screwed to it so it's relatively 'stable'.

 

NodeMCU Pin Map (and which ones you can actually use for stuff)

I found this picture on the internet, and it's gold – thank you to whoever made it. I always struggle with which pins you can actually use for stuff, and which pins will stop the thing from booting or flashing if you try to use them for weird stuff (like SD2).

For translation – pins D9 and D10 are RX/TX for the onboard USB serial port – so if you are using USB serial, you can't use these pins.

D1/D2 – normally I2C (GPIO 5 and 4) – tested this as well with a servo controller board and an I2C scan tool.

So D3 and D4 can only be used for digital writes.

D9 and D10 are RX/TX on the USB serial – so if you want to debug in the console, you can't really use these easily.

D1 and D2 work nicely for I2C, but also work for digital read or write.

SD3 works as GPIO10.

SD2 is evidently a no-go – I did an I2C test on GPIO9 (SD2) and GPIO10 (SD3); the I2C scan found the device connected, but trying to use it just made the NodeMCU freak out and reboot.

D5, D6, D7 and D8 are all fully usable for read and write.

D0 messes with booting/flashing, so I generally stay clear of it.

A0 is the analog pin.

Dynamics 365 – Connecting your app using MFA

MFA is becoming a common thing, as joyous as it is to use 🙂

I have a C# app that connects to Dynamics CRM/365 and I had to update it to support Microsoft Azure MFA.

I couldn't really find any definitive guide out there; I had to cobble all sorts of different things together to get a working solution.

I hope this guide helps out some other poor sucker like me.

 

1. I had to update my application to use the modern CRM Tooling method of connection.

I added the following via NuGet to my solution – the key being the CrmTooling package, which supports the new connection-string method of connecting.

In my code, I changed the way I obtain an IOrganizationService to the below (simplified):

string conn = "my connection string";

IOrganizationService _crmService;

CrmServiceClient service = new CrmServiceClient(conn);

_crmService = service.OrganizationWebProxyClient != null
    ? (IOrganizationService)service.OrganizationWebProxyClient
    : (IOrganizationService)service.OrganizationServiceProxy;

 

This gets me a connection using the new Tooling DLL and a CRM Connection string.

 

Next step, you need to create an application in Azure AD.  I followed this guide.

The trick is the Redirect URI – I wasn't working with a web app, so I ended up using http://localhost.

https://docs.microsoft.com/en-us/dynamics365/customer-engagement/developer/walkthrough-register-dynamics-365-app-azure-active-directory

 

Finally, I constructed a connection string that would work with the newly registered Azure AD app:

AuthType=OAuth;Url=https://yourcrm.crm.dynamics.com;AppId=yournewappid;RedirectUri=http://localhost;

 

Now when you go to connect, the Microsoft sign-in assistant pops up and handles the authentication to the CRM instance.

And, if you have MFA turned on, you are also prompted with MFA.

 

Happy Days!

 

Dynamics CRM 365 – On Prem – Invalid Trace Directory

Looks like another piece of CRM team awesomeness.

The Tracing directory should be:

C:\Program Files\Microsoft Dynamics CRM\Trace

 

But some update somewhere changes it to:

c:\crmdrop\logs

 

That’s not very helpful.

I initially tried to change the trace directory back to the right place using the CRM PowerShell cmdlets, but that failed with authentication errors (which I have also posted about here – http://paulobrien.co.nz/2018/03/07/get-crmsetting-powershell-the-caller-was-not-authenticated-by-the-service-the-request-for-security-token-could-not-be-satisfied-because-authentication-failed/).

 

This is the guide I tried for the PowerShell method – it makes sense, if CRM PowerShell wasn't broken as well.

How to fix ‘Invalid Trace Directory’ errors

 

So I ended up changing it in the CRM database and the registry:

 

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSCRM]
"TraceDirectory"="C:\\Program Files\\Microsoft Dynamics CRM\\Trace"
"TraceEnabled"=dword:00000001

 

And in the MSCRM_CONFIG database:

SELECT NVarCharColumn
FROM [MSCRM_CONFIG].[dbo].[ServerSettingsProperties]
WHERE ColumnName = 'TraceDirectory'

UPDATE [MSCRM_CONFIG].[dbo].[ServerSettingsProperties]
SET NVarCharColumn = 'C:\Program Files\Microsoft Dynamics CRM\Trace'
WHERE ColumnName = 'TraceDirectory'