Allowing Teams Live Event guests to join anonymously by default

When I started producing a large amount of public facing Teams Live Events, I consistently had feedback that joining the event was too confusing.

I agree: the default Microsoft Teams join experience can be a little convoluted for people who don’t already use Teams, especially if they aren’t particularly tech savvy in the first place.
On the desktop, you are prompted by default to download the Teams client, and the big buttons in front of you then prompt you to sign in or create an account. The option to join as a guest or anonymous participant is secondary and often missed.
The same applies on mobile, where the link to join the meeting prompts a download of the app; most people then open the app, which in turn prompts them to sign in or create an account.

We really wanted to streamline things and lower the barrier to entry for our outside attendees, and so we gave this feedback to our Microsoft contacts. They acknowledged these challenges and gave me this little trick to ease the experience. I have not seen this documented anywhere (maybe I missed it) so I thought I’d write it up here.

The trick

By making some tweaks to the event URL, on a desktop the event will automatically load in the browser (no software download prompt) and join anonymously; on mobile, it will automatically join anonymously.

To make this work, you will end up with two URLs: one for desktop and one for mobile. The reason is that we want desktop users to launch in a browser rather than attempt to open an app.

TIP: Put both finished URLs behind a link shortener. I found that some email mailing platforms will invalidate the raw links. You also get the added protection of being able to swap the links if you make a mistake, or if the event link needs to change, without having to resend anything to your participants.

Creating the links

On the desktop, we are going to get the link to automatically join the event in the browser and as a guest or anonymous participant.

Take your event url and put it in your text editor of choice.

The first things you need to search for and replace in the URL are three URL-encoded (percent-encoded) characters.

Replace %3a with :
Replace %40 with @
Replace %2c with ,

Next, append (put at the end) the below string that will automatically join the event as an anonymous participant.


You have now created a URL that can be used on a mobile device.

In order to create the new URL for the desktop, you need to make one further modification.

The URL begins with
Change this to insert /_# between and l/

Below is an image with an example of an original URL modified to Mobile & Desktop. I’ve marked the changes out in colour to hopefully make it clear. Due to the atrociously long URL, I have deliberately put line breaks in the URL for clarity. Click to enlarge the image.
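To make the steps above concrete, here is a minimal Python sketch of the transformation. The anonymous-join string itself isn’t reproduced here, so it is passed in as a parameter, and the placement of /_# ahead of the /l/ segment is my reading of the desktop change; check both against your own event URL.

```python
# A minimal sketch of the URL rewrite. The anon_suffix parameter stands in
# for the anonymous-join string from the post, which is not reproduced here.

def make_mobile_url(event_url: str, anon_suffix: str) -> str:
    """Decode the three percent-encoded characters and append the
    anonymous-join string to produce the mobile link."""
    url = (event_url.replace("%3a", ":")
                    .replace("%40", "@")
                    .replace("%2c", ","))
    return url + anon_suffix

def make_desktop_url(mobile_url: str) -> str:
    """Insert /_# ahead of the /l/ path segment so the desktop link
    opens straight in the browser (placement assumed from the example)."""
    return mobile_url.replace("/l/", "/_#/l/", 1)
```

Feed the mobile URL through `make_desktop_url` to produce the second link for desktop users.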

Making the process easy with Power Automate

If the above seems a little fiddly, then I agree. So I made a Power Automate Flow to make it a quick and accurate process.

Contact me if you’d like a copy.

Hopefully this trick makes your Live Events a little more accessible to everyone.

Recording Teams or Skype meetings with NDI

With a standard Teams or Skype meeting recording you can’t control what is being recorded in terms of which meeting participants are on screen. In Teams, even with the new Spotlight feature, the recordings are not controllable.

Ever had the need to record meetings with more control?
Now you can: Microsoft Teams includes support for a network video protocol called NDI. Consumer Skype has had this support for a while, which is why Skype is often used in live broadcast to bring remote guests into interviews with a Skype TX based system.

NDI (Network Device Interface) is a standard developed by a company called NewTek, primarily designed to enable video products to communicate over a computer network with low latency.

We recently used this function with our Recording & Broadcast team to create a 3-hour event during the COVID lockdown without anyone needing to be in the same room with a camera.

Things you will need

In order to use NDI in Teams, your Teams Administrator will have to have enabled the feature in your tenant within the meeting policy you are assigned to.

Only users within your Teams tenant will be able to use this function, so you don’t need to worry about guests and other attendees being able to record.

The only thing you will now need, in addition to the Teams client (Windows only at this time), is a tool that can read the NDI feed. People that are used to working with live broadcast tools like OBS or XSplit can use those, but there is an easier way, especially if you need to record multiple presenters from the meeting separately.

NewTek provides a free NDI Tools kit which has everything you need and runs on both Mac and PC.

NDI Tools PC tools

Enable NDI on the Teams client

In your Teams client, you need to turn on the NDI feature in the settings BEFORE you join the meeting or you won’t see the meeting option to enable NDI Broadcast (New in November 2020).

You will find this option in Settings -> Permissions -> Network Device Interface (NDI)

If you are using Skype, make sure you have a current version and look in the Settings -> Calling -> Advanced area to turn NDI on.

Turn on Broadcast over NDI in the Teams Meeting

In the November client update, you now enable NDI per meeting from the ellipsis menu.

Using the NDI Tools Studio Monitor or Video Monitor

I’m using a PC so I will use the NDI Tools Studio Monitor.

Right click in the Studio Monitor window and you should see the name of your computer and the different presenters in your meeting that you can choose from.

In my demo meeting, I have four people. You will always appear as MS Teams – (Local). Note that you also have the option of choosing the Active Speaker so you always capture the person speaking.
Large Gallery and Together Mode will also be available when there are enough people in the meeting along with content when it is being shared into the meeting.

When you are ready to record, just hit the red record dot in the bottom left corner.

If you need to record other presenters at the same time, just look in the menu under Settings -> Application -> New Monitor… to open another Studio Monitor window, pick a different presenter and hit record.

Things to note

I haven’t had the need to record more than 5 presenters at a time so far, but if you are looking to record a lot, it is worth looking at the Livemind Recorder tool, which can simultaneously record up to 16 NDI feeds in a single interface.

Any modern PC or Mac should be able to record NDI streams.
For the best performance you are going to need to record to an SSD, as each feed could be writing 2 MB/s to 10 MB/s to disk depending on the quality of the feed you are receiving from Teams.
Enable hardware acceleration on each Studio Monitor window by going to Settings -> Video -> Allow HW Acceleration.
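As a rough back-of-envelope check (using the 2–10 MB/s per-feed figures above), you can estimate the combined disk write load for a multi-presenter recording:

```python
def ndi_write_load(feeds: int, low_mb: int = 2, high_mb: int = 10):
    """Return the (min, max) combined disk write rate in MB/s for a
    given number of simultaneously recorded NDI feeds."""
    return feeds * low_mb, feeds * high_mb

# Four presenters recorded at once could write up to 40 MB/s to disk,
# which is why an SSD is recommended.
```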

Videos are recorded into a MOV file using the NewTek SpeedHQ (SHQ2) codec, which is designed to have very low CPU overhead and to be used as an intermediary codec. VLC and other programs will support playback because the NDI Tools package installs the required codec. Handbrake has no problem converting these files into an MP4 if required.
You will also notice a .NDI file written alongside it. If you use Adobe Premiere (and have the NDI Tools installed), this file provides timecode information making syncing up multiple recordings easy.

The resolution of what you record from each feed is governed by the Teams client and works exactly how it does in the Teams world. It will be a reflection of the feed that the Teams client is receiving.
This starts with the quality of the picture that the presenter is sending and ends with how good your internet bandwidth is and performance of the device you are using.
It is possible to record a clean 720p (or 1080p when Teams reinstates that) picture if the presenter is sending that resolution and your client is receiving it.

The other thing that governs the resolution is the media bit rate setting in the Teams Meetings Policies as set by your Teams Administrator.
You may need to get your Teams Administrator to check the media bit rate setting of the Meetings Policy if you are expecting to do a lot of recording. It will need to be set to at least 10,000 Kbps if you’re looking to capture multiple presenters in the same meeting at 720p or above.

Hope this helps you create great content from Teams Meetings and Events.

OneDrive B2B sync

Microsoft has released new functionality in the OneDrive sync client that lets users sync libraries or folders in SharePoint or OneDrive that have been shared from other organizations.

They have a Docs article up about this B2B Sync capability and list it as still in preview as of the date of this post.

To get it to work right now you will need to ensure you have set your OneDrive client to join the Insider preview ring.

To do this in Windows, right click on your OneDrive icon and choose Settings. Go to the About tab and tick the box to join the Insider ring.

Upon pressing OK, the OneDrive client will restart and install the latest Insider ring release.

Ensure you restart your computer. Only then will B2B sync work.

Microsoft Flow – Tweet today’s events from an RSS feed

I started a little community project a few years ago called Melbourne Theatre Calendar, with the aim of listing all of the theatre shows in Melbourne in a single place. It came out of frustration that when I wanted to go see a show I couldn’t easily see what was playing ‘today’ or ‘tomorrow’.

Now that the site has a lot of content, I want to expand how I can share the information starting with Twitter. The site is built upon WordPress with The Events Calendar plugin and that plugin provides an RSS feed of the next 10 events.

After playing around with some online services that can Tweet from an RSS feed, I decided I could get a better outcome for free by using Microsoft Flow.

My goal was pretty simple, at 8am every day (local time), tweet all today’s events as individual tweets from the RSS feed. No events today, no tweet.
One small challenge is that the RSS feed contains the next 10 events regardless of event date so this needed to be accommodated.

A high level view of the resulting flow looks like Recurrence, List all RSS feed items and Apply to each.

In more detail, it’s built as follows:

Recurrence, this is the easy part. Interval of 1 Day, running at 8am.

List all RSS feed items is also pretty straightforward. We can’t use the since option, as the RSS feed could also be showing future events, so we’ll do the filtering in the next step.

The Apply to each action step is where things get more interesting and for me, actually started with the Condition action.

The finished result is pictured below but I’ll walk you through how it was created.

Choosing a new action below the RSS feed, I selected the Condition Control action.

You are then presented with the condition statement area and the If yes and If no sides of the condition.

The condition statement is just a true or false, yes or no test.
If something IS something where the IS could be equal to, greater than, contains, does not contain and so forth.
For me, I wanted a statement along the lines of [date of RSS feed item] [is equal to] [today’s date].

Starting on the left side of the statement, I chose the dynamic field from the RSS feed called Feed published on.
When you do this, Flow automatically wraps an Apply to each action around the condition, as it recognises that an RSS feed has multiple items. It puts the dynamic field Body in the ‘select an output from previous steps’ box for you too.

The test is ‘is equal to’.

The right hand side of the statement was more of a challenge to figure out and I turned to the Microsoft Flow Community for some suggestions after hitting a few brick walls.

As my test is related to the date only, I needed to first convert the Feed published on value to a date only format.
This is achieved using the formatDateTime function with the yyyy-MM-dd format string.
The end result on the left side of the statement therefore became:


With the feed item date now on the left, I needed to get today’s date to compare it to.

It was suggested I use the utcNow function but I found this wouldn’t work due to timezone differences. As I was running this flow at 8am UTC+10 this would return a value of 22:00 UTC the day before and never match the UTC date values of events that were afternoon and evening based.

Enter the getFutureTime function. By using this to add 4 hours to the returned value, I could ensure the converted date would always be the same date as the event. It has its own date format string too!
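In ordinary code, the comparison the condition performs looks roughly like this Python sketch (assuming an ISO-formatted publish date from the feed; the 4-hour offset mirrors getFutureTime adding 4 hours):

```python
from datetime import datetime, timedelta, timezone

def is_event_today(publish_date_utc: str, hours_ahead: int = 4) -> bool:
    """Mimic the Flow condition: the feed item's date (yyyy-MM-dd) must
    equal 'today', nudged forward a few hours so a run at 8am UTC+10
    doesn't compare against the previous day's UTC date."""
    item_day = datetime.fromisoformat(publish_date_utc).strftime("%Y-%m-%d")
    today = (datetime.now(timezone.utc)
             + timedelta(hours=hours_ahead)).strftime("%Y-%m-%d")
    return item_day == today
```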


Now onto the actions. I first built this Flow with the actions outputting to email. I recommend this while you’re nutting out problems with your Flow, as you can use obvious text to indicate whether the email was the result of the statement being true or false.

The If no branch was always going to be blank: if the feed item wasn’t for an event today, I didn’t want it tweeted.

The If yes branch was the Post a tweet action. As the RSS event dates were in UTC, I needed to convert them to the correct timezone with the convertFromUtc function. With this you set the timezone and the display format.

convertFromUtc(items('Apply_to_each')?['publishDate'],'AUS Eastern Standard Time','f')

The finished Flow looks as below.

Onsite wifi shooting with multiple Canon EOS 6D

One of my private clients is a photographer and he’s done a pretty good job of keeping pace with technology.
He made the jump to digital pretty early, worked through the issues with colour and digital print quality, and has installed his own digital photo lab.
As camera megapixels increase, storage and file processing speed challenges start to crop up, and a small business starts needing serious technology such as storage systems and 10-gigabit Ethernet.

His latest challenge to me was changing the way he shot his major yearly dance event.
In the past photos were taken and then staff would visit dance schools a few weeks later after the photos had been processed and do the selling.
The new approach this year was to shoot, retouch, print and sell on the same day at the venue to capitalise on the impulse buy. Simple. Yep. Kind of.

Enter the equipment.
FujiFilm Frontier-S DX100 printer
Canon EOS 6D cameras
Adobe Lightroom and Photoshop
Canon EOS Utility
Two PCs
Wireless router

shooting setup

The FujiFilm DX100 is a stunning little 6-colour inkjet photolab. Inkjet, you say? Humph. Well, this thing prints as well as its full-sized (small-car-sized) FujiFilm brother. Mind you, when I say little, it’s twice the size of your average inkjet printer.
The cost is about 40c per print compared to 4c from the big brother.
My only gripe about this unit is the lack of onboard Ethernet. It’s USB only, which means you need to resort to Windows printer sharing.


The Canon EOS 6D is his current stock camera, and the requirement was for two to shoot at the same time (he has a light and a dark background) and save directly to the computer. For this job we set up four camera bodies with WiFi shooting, while also saving to card as a safety.
This is where we found the first limitation: the Canon EOS Utility only allows you to pair a single camera at a time, so in order to shoot with two cameras we had to have two different computers running the EOS Utility.
I could not find any 3rd-party software or Canon solution to this limitation. There is software out there, but it is geared to remote-triggering multiple cameras at the same time, which is not what we’re after here.
What we really need is a camera server edition whose only job is to receive photos from multiple cameras as they are taken.

Lightroom and Photoshop are the final pieces of the solution.
The original plan was to have a single computer renaming, processing and printing all the images. We originally set both EOS Utilities to save their files to the same location (via the network), the main reason being that Lightroom can only have one auto-import folder.
Due to the pace of the event, a single processing workstation couldn’t keep up, so we swapped out the simple laptop that was running the second EOS Utility and brought in another workstation.
Another frustration of mine is Lightroom’s lack of support for a shared catalogue. We have the same problem at his studio, where he has 4 processing workstations. An image processed on one workstation means nothing to another unless the final product is exported.
I guess the same could be said of Lightroom and Photoshop. As Photoshop knows nothing of Lightroom edits, you need to export in order to run the Photoshop actions that Lightroom doesn’t have.

The final outcome is that each backdrop (light or dark) has its own processing workflow:
Camera -> EOS Utility -> Lightroom (crop and renumbering, then export) -> Photoshop actions -> print from Photoshop.

In fact for the remainder of this job we are also adding a second printer so each backdrop is now truly independent.

TIP: On two of our computers, Norton firewall software interfered with the Canon EOS Utility network communications. Even after disabling it, we had to completely re-pair the cameras, and sometimes again when turning the computer on the next day.
Next year we will completely remove the security software prior to this job and reinstall it afterwards. Canon might need to look at improving the robustness of the software, as there should not have been a need to completely re-pair cameras.

CANON: Multiple camera shooting please!
ADOBE: Lightroom Multiuser Catalogues please.

Modifying USMT and KACE to capture Firefox settings and other specific programs

It’s that time in the hardware refresh cycle again where you have to replace laptops en masse; well, at least it is for me.

Our main challenge was migrating users’ Firefox bookmarks, along with the desire to capture Outlook signatures and auto-complete information without capturing all Office application information (we wanted to start as fresh as possible).

I’ve never really dug in depth into the USMT and K2000 before now, and I’ve found them in need of a little massaging.

USMT Problem

The USMT definition XML file for applications (MigApp.xml) included with USMT 5.0 does provide support for many non-Microsoft products including Firefox, Chrome and Adobe Acrobat, amongst others. The only problem is that Microsoft hasn’t had the inclination to keep it up to date.

Thanks to some clues from fellow ITNinja Jegolf, I found that the MigApp.xml is hard coded to look for Mozilla Firefox 3 (hello, circa 2008).

(assuming WAIK 8.1)

Edit the MigApp.xml files in both the “C:\Program Files (x86)\Windows Kits\8.1\Assessment and Deployment Kit\User State Migration Tool\amd64” and “C:\Program Files (x86)\Windows Kits\8.1\Assessment and Deployment Kit\User State Migration Tool\x86” folders.

The line to modify:

<condition>MigXmlHelper.DoesObjectExist("Registry","%HklmWowSoftware%\Mozilla\Mozilla Firefox 3.*\bin [PathToExe]")</condition>

Change it to:

<condition>MigXmlHelper.DoesObjectExist("Registry","%HklmWowSoftware%\Mozilla\Mozilla Firefox *.*\bin [PathToExe]")</condition>
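If you’d rather script the edit than hand-modify two files, here is a small Python sketch that applies the same replacement (the paths are the WAIK 8.1 folders from above; run it with administrator rights):

```python
from pathlib import Path

def patch_migapp(xml_path: Path) -> None:
    """Widen the hard-coded 'Mozilla Firefox 3.*' registry check so any
    Firefox version matches (the same edit as the manual replacement)."""
    text = xml_path.read_text(encoding="utf-8")
    xml_path.write_text(text.replace("Mozilla Firefox 3.*",
                                     "Mozilla Firefox *.*"),
                        encoding="utf-8")

# The two copies of MigApp.xml under the WAIK 8.1 USMT folder:
usmt_root = Path(r"C:\Program Files (x86)\Windows Kits\8.1"
                 r"\Assessment and Deployment Kit\User State Migration Tool")
# for arch in ("amd64", "x86"):
#     patch_migapp(usmt_root / arch / "MigApp.xml")
```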

After making these modifications, re-upload the USMT tool into the K2000.

Now if you choose the User Data tick box under Documents To Be Scanned in the K2000 USMT Scan Template, any version of Firefox will correctly be migrated.

Firefox specific migration (and other) without migrating ALL User Data

In the K2000 USMT Scan Template, if you tick User Data then it migrates anything in the MigApp.xml template which is anything from Firefox to all Office components to Acrobat etc etc.

This is not particularly helpful if you want to be more granular about what you take to ensure you don’t pass on redundant or out of date settings.

To customize the USMT Scan Template created in the K2000 is not as easy as it could be.

  1. Create a KACE USMT Scan Template and customize it with any visible settings but DO NOT tick User Data.

  2. Export this USMT Scan Template from the Package Management area of the K2000.

  3. Browse to the \restore Samba share and find the exported package.

  4. Extract the package with 7-Zip.

  5. Open the extracted file with notepad and copy the USMT XML component out to a new file.

    This begins with <Configuration> and ends with </Configuration>

    To add Firefox, you must add a new section to the file called Applications and within that section add the Firefox component. You can add this at the top directly below <Configuration>

    <component displayname="Mozilla Firefox" migrate="yes" ID="firefox/settings"/>

    You can also specifically add other components by adding them to the applications section so long as they exist in the MigApp.xml.

    For Outlook 2010 this would be:
    <component displayname="Microsoft Office Outlook 2010" migrate="yes" ID="office 2010/container/microsoft office outlook 2010/settings"/>

  6. Save this file with an .xml file extension.

  7. In the K2000, open your USMT Scan Template and under the Content Configuration tick Specify config file.

  8. Browse and select the XML file you created and then Save the USMT Scan Template.
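Putting step 5 and the component lines together, the finished config file ends up looking something like this (only the Applications section is shown; the rest of the exported <Configuration> content stays as the K2000 produced it):

```xml
<Configuration>
  <Applications>
    <component displayname="Mozilla Firefox" migrate="yes" ID="firefox/settings"/>
    <component displayname="Microsoft Office Outlook 2010" migrate="yes"
               ID="office 2010/container/microsoft office outlook 2010/settings"/>
  </Applications>
  <!-- ...the Documents and other sections from the exported template follow here... -->
</Configuration>
```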

When you reopen this USMT Scan Template, the K2000 shows it in the Template GUI format but as this GUI is not aware of the Applications section of the config file it won’t be displayed. It does however exist and modifying and saving the USMT Scan Template will not overwrite it (an export of the USMT Scan Template proves this).

So, what have we learned:

a) Microsoft didn’t bother fixing the Firefox version number hard coding in the MigApp.xml file. This is possibly a problem for Chrome and other applications mentioned in it.

b) KACE USMT Scan Template GUI is not aware of Applications section of config file.

c) KACE USMT Scan Templates are ALL or nothing for applications. Granularity of applications already built into USMT (anything listed in MigApp.xml) would be better.

d) the ‘Specify config file’ option in the KACE USMT Scan Templates is ambiguous as to the required format of the config file. I only got this working when I exported a template from the KACE (thanks to KACE support, as I wasn’t aware you could extract the packages) and copied the XML.
The ability to save an example config, or the current config, out for modification would make it simple to add customisation.

Dell EqualLogic Virtual Storage Manager (VSM) hangs on VASA registration

(I may as well do something useful with this blog like add content Google can index to help people solve the same problems I’ve encountered in my day to day work.)

So we run Dell EqualLogic arrays at work and I’ve had a problem getting the storage provider to register with the VMware VASA service.

The Dell EqualLogic Virtual Storage Manager appliance (4.0.1) would hang with the message:
Waiting for VMware vCenter to register with the VASA Provider

After much troubleshooting with Dell ProSupport we found the issue was related to expired certificates in the Java services used by VMware.

There are two knowledge base articles you should refer to in order to confirm and fix this problem.

DELL: in VSM versions prior to 4.0.1 there was a certificate, distributed as part of the included JDK, that expired in 2013.
Dell released a knowledge base article on how to fix this.
VSM fails to register with the VASA service on vCenter

In my case, it was the certificate on the VMware side that had expired. When looking in the VSM logs, the engineer only had the certificate date to go by, as the name was not listed.
This matched the expired certificate we found when following the VMware knowledge base article on the issue.
Registering a VASA provider with vCenter Server fails and reports the error: InvalidCertificate (2079087)

Bug: The Dell VSM appliance doesn’t time out or fail if it can’t register with VASA; it remains stuck in a retry loop. Ctrl+C does stop it, show a failed message and point to a log file. However, VASA is then reported as SET in the VSM console.
Hopefully this is fixed in future releases.

Bug: Expired VMware certificate. I don’t believe I missed anything in the upgrade documentation whilst upgrading vCenter from 5.0 to 5.1 then 5.5 over the last 18 months.
Should VMware have flagged this during these processes or updated this certificate?

Phone phishing / fraud still going strong

Well it seems that phone phishing is sadly alive and rampant in Australia.

Yet another client reported they had been cold called by a company, name given as Global Computer Solutions, claiming their computer had errors.
They mentioned that Microsoft had passed on information to them that this person’s computer had errors on it, along with their contact details.
Of course, anyone with some privacy wits about them would know that Microsoft would probably be breaching numerous privacy laws if this were the case. Come to think of it, when was the last time you bought a computer and registered Microsoft Windows with Microsoft (i.e. gave them your personal details)?

When challenged as to their identity, the caller gave their name and a number that could be called to verify who they were. Funnily enough, they have a Melbourne office.
Well, not really; they just have a Melbourne number: 03 90160451, which I suspect redirects back to India, where the call centre is. (Am I suspecting too much?)

Using my trusty friend Google, I see that this phone number is listed on two other computer repair websites.
Funnily enough, they have other numbers for other countries and also, gee, the company’s address is in West Bengal, India.

I’d really like to hope that the ACCC and the phone companies would jump on these companies and disconnect their services promptly (or at least their local services).

Graham Cluley and his guest Sean Richmond discussed this very issue on a podcast.
Check it out, it’s not very long.

(Sophos 05 November 2010, duration 6:15 minutes, size 4.5MBytes)

In fact, I recommend the Sophos Naked Security blog as a trusted source of information about security-related issues in the IT world, covering everything from Facebook and Twitter to general security issues and news.
They make it very accessible for all user levels. They’re on Facebook too.

To those in the IT industry or those that have some web sense, these scams are nothing new. To those that are new to this, I hope this helps educate you.
To Google, I hope this helps add to the information that is already out there about these frauds to assist those looking for information.

Hello, this is a phishy call.

And that number again just to make sure Google picks it up: 03 9016 0451 0390160451

Roadtrip stats

And so the roadtrip 2011 has ended.

Here are some stats for you.

10,603 Km travelled

1,040 Litres of Fuel
$1,669 spent on fuel

9.8 litres per 100km, average

Most expensive fuel, $2.05 per litre (Balladonia WA ?)
Average fuel price, $1.60 per litre

Number of times tyres inflated and deflated, approximately a dozen.

497.7 gigabytes of video footage shot by the car cam
160+ hours of video footage shot by the car cam

1690 photos taken (Stephen probably took 5 times as many)

Number of times Stephen slept in the car, approximately a dozen.

Number of bottles of wine purchased, 16.

Day 21 – follow the smell of the coffee


The final day of the roadtrip first took us east, back across the border to Mildura. Today was another 700-odd km day, but we did make a few stops. We stopped in Mildura down at Lock 11, where we found it currently not active due to the river level being so high. I didn’t realise the weir on the other side of the island (Lock Island) is completely removable. It’s currently up on dry land, possibly looking like it’s having some repairs done to it.

Next stop was for lunch in Sea Lake and my first decent coffee in 3 weeks. Why does coffee just taste better in Victoria?

After another seemingly endless straight road, we arrived in Wycheproof, which has the distinction of being the only town in Victoria where a train travels down the main street. I’m pretty sure that I travelled on a SteamRail trip about 15 years ago that did exactly that. It’s hard to find any information confirming that the line is currently in use, although in the past few years it seems it has been used for grain transport. There is a K series locomotive on static display next to the old turntable, and they’ve restored the station building.

Next stop wasn’t until Bendigo for fuel and then on to Melbourne.

Sections of the road today were rolling fields of tumbling grass or weed. It was almost like driving through snow, with drifts of the stuff piled up against fences and along the side of the road. It made the drive a lot more scenic than it otherwise would have been.