Creating an SSH Key on macOS for Automatic Login to Linux

Lately I have been working a lot in a terminal, specifically with Linux VMs in my home lab environment. Logging into multiple VMs over SSH, again and again, had become fairly repetitive; that is, until I created an SSH key for automatic login to my VMs.

To make things easier in the future when it comes to logging in via Terminal, you can set up SSH keys so you won’t need a password when you log in.

In my lab I have been primarily working with “minimal” installations of CentOS 6 and CentOS 7. When you do a “minimal” install of CentOS, the installation doesn’t include utilities like rsync and scp by default.

To install rsync and scp run the following on your CentOS client:
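
Something along these lines should cover it (openssh-clients is the CentOS package that provides scp and ssh-copy-id):

    sudo yum install -y rsync openssh-clients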

While still connected to my CentOS client, I created a directory to store my RSA keys with the following command:
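
For example, using the standard location:

    mkdir -p ~/.ssh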

With the Terminal open on your macOS system, enter the following command to generate an RSA key for login:
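
A typical invocation looks like this (the 4096-bit key length is my own preference; the default settings work fine):

    ssh-keygen -t rsa -b 4096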

Press enter three times to accept the default settings.

On my macOS client, I like to tighten up the file system permission via the following example:
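
Along these lines, assuming the default key file names:

    chmod 700 ~/.ssh
    chmod 600 ~/.ssh/id_rsa
    chmod 644 ~/.ssh/id_rsa.pub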

To verify our permissions we can run:
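
For example:

    ls -la ~/.ssh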

On my macOS client, I see the following:
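
Illustrative output only; your user name, sizes, and dates will differ. The important part is that the private key is readable only by you and the .ssh directory itself is owner-only:

    drwx------   5 jdoe  staff   160 Sep  4 13:10 .
    drwxr-xr-x+ 15 jdoe  staff   480 Sep  4 13:05 ..
    -rw-------   1 jdoe  staff  3243 Sep  4 13:10 id_rsa
    -rw-r--r--   1 jdoe  staff   747 Sep  4 13:10 id_rsa.pub
    -rw-r--r--   1 jdoe  staff   356 Sep  4 13:12 known_hosts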


Now that we have a key, the next thing we need to do is copy the key to a directory on the server we intend to SSH to via Terminal. This will allow for password-less logins. With the IP or DNS name of the server we intend to connect to in mind, we can enter a command similar to the following to copy our key to the CentOS VM:
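
For example, appending the public key over SSH (the user and address below are placeholders; ssh-copy-id, where available, does the same thing in one step):

    cat ~/.ssh/id_rsa.pub | ssh root@192.168.1.50 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'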

As on my macOS client where we created the RSA key, I will tighten up the SSH directory permissions on my CentOS VM with the following commands from my macOS Terminal:
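
Something like the following, using the same placeholder address as above:

    ssh root@192.168.1.50 'chmod 700 ~/.ssh && chmod 600 ~/.ssh/authorized_keys'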

We should now be able to SSH from our macOS client to our CentOS VM without providing a password!






Installing PowerShell and VS Code on Mac OSX

“If someone told me 10 years ago that we (Microsoft) would offer SQL Server on Linux, I would have laughed them out of the room.” – Presenter while at Microsoft HQ Redmond, WA

It’s 2016 and Microsoft is breaking down the walls to its golden garden. With recent announcements coming from all across the board at Microsoft, it really is an interesting time. Never before have the tech giants been so open to sharing software across distributions and vendors.

In the early 1980s, Richard Matthew Stallman began a movement within the software industry, preaching that software should be free.

“Free software” is a matter of liberty, not price. To understand the concept, you should think of “free” as in “free speech,” not as in “free beer.”

Why would anyone in their right mind want to give away their software for free? I think the better question is: “How do we enable ourselves as developers and users, while protecting ourselves at the same time?” A user of the software should never be forced to deal with a developer who might or might not support that user’s intentions for the software. The user should never have to wait for bug fixes to be published. Code developed under the scrutiny of other programmers is typically of higher quality than code written behind locked doors (we have been looking at you, Apple and Microsoft). One of the great benefits of open source software comes from the users themselves. If a user desires or needs a new feature, they can add it to the original program and then contribute it back to the source so that everyone else can benefit from it. This is a large part of why the popularity of GitHub has risen so dramatically in the last couple of years.

This line of thinking sparked a desire to release a complete UNIX-like system (Linux) to the public, free of license restrictions.

Just a couple of weeks ago Microsoft released this blog article announcing its latest “customer obsessed” move, further convincing us that “Microsoft loves Linux.” PowerShell and Visual Studio Code can now be run on Linux and OS X.

It’s about time.

I have been running OSX Sierra on my 2015 13-inch MacBook Pro for as long as the public beta has been available, and now I can edit and run my PowerShell scripts directly in my Terminal.

Installing PowerShell

First, you will need to download the powershell-6.0.0-alpha.9.pkg package from the PowerShell GitHub releases page onto your macOS machine.

Either double-click the file and follow the prompts, or install it from the terminal.
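
If you go the terminal route, the stock macOS installer command handles it (assuming the package sits in ~/Downloads):

    sudo installer -pkg ~/Downloads/powershell-6.0.0-alpha.9.pkg -target /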


Once the package has been installed, running the command powershell puts you directly into a PowerShell session in your Mac OSX terminal:


$PSHOME is /usr/local/microsoft/powershell/6.0.0-alpha.9/, and the symlink is placed at /usr/local/bin/powershell.

Installing VSCode


  1. Download Visual Studio Code for Mac OS X.
  2. Double-click on the downloaded archive to expand the contents.
  3. Drag Visual Studio Code to the Applications folder, making it available in the Launchpad.
  4. Add VS Code to your Dock by right-clicking on the icon and choosing Options, Keep in Dock.

Tip: If you want to run VS Code from the terminal by simply typing ‘code’, VS Code has a command, Shell Command: Install ‘code’ command in PATH, to add ‘code’ to your $PATH variable list.

After installation, launch VS Code. Now open the Command Palette (⇧⌘P) and type shell command to find the Shell Command: Install ‘code’ command in PATH command.

OS X shell commands

After executing the command, restart the terminal for the new $PATH value to take effect. You’ll be able to simply type ‘code .’ in any folder to start editing files in that folder.

Installing PowerShell Extension

Launch the Visual Studio Code app by:

  • Typing code in your terminal or PowerShell session (now that ‘code’ is on your $PATH).
    • Press F1 (or ⇧⌘P) to open the “Command Palette” inside the Visual Studio Code app.
  • In the Command Palette, type ext install and hit Enter. It will show the Visual Studio Code extensions available to install.
  • Choose PowerShell and click on Install; you will see something like below.


  • After the install, you will see the Install button turn into an Enable button.
  • Click on Enable and OK
  • Now you are ready for editing. For example, to create a new file, click File->New. To save it, click File->Save and then provide a file name, let’s say “helloworld.ps1”. To close the file, click on “x” next to the file name. To exit Visual Studio Code, File->Exit.

For further information, check out Microsoft’s documentation on GitHub:

Managing contact objects in AD when Exchange was never there

Have you noticed that if you have never had an Exchange server in your Active Directory environment, it becomes extremely annoying to manage contact objects? I recently came across this nuisance.

Recently tasked with a Domino Notes to Exchange Online migration, I had to create contact objects for the Notes mail users containing the relevant attributes needed to migrate, which presented me with a truckload of contacts to validate.

Thanks to the powers of automation, there is a go-to cmdlet that comes to mind when opening up my PowerShell console equipped with the AD module:



But…. But Microsoft……

Turns out, if you want to manage contact objects in AD using PowerShell without the availability of the EMS, the cmdlet you actually want is Get-ADObject (paired with Set-ADObject for making changes).

I am sure Microsoft has their reasons, but since I wanted to manage ADSI attributes of my contacts, it left me scratching my head. How am I to bulk change attributes for contact objects using the AD Module in PowerShell?

In my case, I needed to fix the mailNickname attribute as it had been appended with the Notes users e-mail address instead of just the syntax of the username and soon to be mail user alias in Exchange Online.

Well, luckily I was able to put something together after prowling the all-knowing Google for answers.

Using the Get-ADObject cmdlet, I was able to target the OU containing the contacts I wanted to manage and select the Name, ObjectGUID, and mailNickname attributes for manipulation. I pipe that into a Set-ADObject for each (fun fact: the “%” sign is an alias for ForEach-Object) of the contacts to replace the mailNickname with whatever the mailNickname is currently set to, minus the “@” symbol and anything that follows it.
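
A sketch of that approach; the OU path below is a placeholder:

    # Grab every contact in the target OU and trim the mailNickname at the "@"
    Get-ADObject -Filter 'objectClass -eq "contact"' `
        -SearchBase "OU=Notes Contacts,DC=corp,DC=example,DC=com" `
        -Properties mailNickname |
      % { Set-ADObject -Identity $_.ObjectGUID -Replace @{ mailNickname = ($_.mailNickname -split '@')[0] } }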


One task complete.

The other attribute fix identified itself as the removal of a proxy address from the proxyAddresses ADSI attribute for each contact object. For this task, I was able to target a specific OU containing all the contact objects in question and remove the bad apple proxy address with the following:

Calling in the LDAP ADSI initiator and specifying a ForEach, I was able to get each contact (in my case, only contact objects existed in the OUs I targeted) that had a proxy address matching the offending pattern. Following this, you’ll notice I pipe that into a ForEach and, for each contact, call PutEx with control code 4 (delete) against the proxyAddresses attribute, passing it the unwanted address so that the value is removed.
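
A sketch of that ADSI approach; the OU and the proxy address pattern are placeholders for the values that did not survive formatting above, and PutEx control code 4 (ADS_PROPERTY_DELETE) is what removes just the matching value:

    Get-ADObject -Filter 'objectClass -eq "contact"' `
        -SearchBase "OU=Notes Contacts,DC=corp,DC=example,DC=com" `
        -Properties proxyAddresses |
      Where-Object { $_.proxyAddresses -like "smtp:*@olddomain.example" } |
      ForEach-Object {
        $contact = [ADSI]("LDAP://" + $_.DistinguishedName)
        $bad = @($_.proxyAddresses) -like "smtp:*@olddomain.example"
        $contact.PutEx(4, "proxyAddresses", @($bad))   # 4 = ADS_PROPERTY_DELETE
        $contact.SetInfo()
      }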

If that went way over your head like it did for me the first time I saw the method, don’t worry. Microsoft has a great article that clarifies this in much more detail:

HOW TO: Use ADSI to Set LDAP Directory Attributes

PowerShell managed to save me from hours of manual attribute changes after all, even when Exchange was never there.

As always, happy scripting!

Connecting to Microsoft Online Services via PowerShell Sessions

It can be fairly annoying to have to run several different sets of commands in your PowerShell console window just to connect to the online service you are working on.

John Weber and I were having a discussion about how annoying it was, and he and I couldn’t help but ask: “Is there an easier way?”
Utilizing Windows PowerShell, we came up with the idea of entering credentials a single time and, in turn, logging into each environment we manage most, each in its own respective PS console window.

I use ConEmu as my PS CLI interface of choice rather than the native console, and one of the best features of ConEmu, in my opinion, is its tabbed console windows. I like to separate my work so I can better keep track of the commands that I am running, especially when I am running a lot of them. Establishing each connection in its own console window therefore sounded like a great fit.

Therefore, using the powers of PowerShell (pun intended) I put together a script to auto-magically do this for us.
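
The script itself is behind the download link below, but the core idea, stripped down, is to prompt for credentials once and reuse them for each service connection, roughly like this (module names reflect the tooling of the time, i.e. MSOnline and the Skype for Business Online Connector):

    # Prompt once, reuse everywhere
    $cred = Get-Credential

    # Office 365 / Azure AD
    Connect-MsolService -Credential $cred

    # Exchange Online
    $exo = New-PSSession -ConfigurationName Microsoft.Exchange `
        -ConnectionUri https://outlook.office365.com/powershell-liveid/ `
        -Credential $cred -Authentication Basic -AllowRedirection
    Import-PSSession $exo

    # Skype for Business Online
    $sfbo = New-CsOnlineSession -Credential $cred
    Import-PSSession $sfbo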

If you would like to download the script, click the link below:

Download Link

Microsoft Portal Links

So many links…

With all of the changes and upgrades that Microsoft has been making in its growing and maturing cloud space, it is incredibly easy to get lost when you need to access a specific service to manage.

In my notes, I have recently compiled the most relevant links to the web portals I log into the most, to make it easier to navigate to the right place when managing my Windows cloud services.

Office 365




Microsoft Partner Network

Set the new domain as the default “Log on to” after ADMT migration

I recently worked with a client that was migrating from one domain to another using Microsoft’s Active Directory Migration Tool or ADMT v3.2.

One of the customer’s concerns, like many others’, was: “What will the impact be on the end users post-migration?”

If you are like most IT administrators, you probably want to minimize the number of help desk tickets you receive on a daily basis, if you can help it. Migrations from one domain to another have “help desk ticket hell” written all over them.


One of the concerns associated with using ADMT as the primary migration tool of choice was how it handles the presentation to the user when they show up to their computer Monday morning after a weekend of migrations.

Unfortunately ADMT does not get super complex with its offerings but perhaps this is intended by Microsoft. After all, it is free.

That said, I encountered the following scenario:

ADMT v3.2 by design leaves the previously logged on username and their associated domain intact after migrating from a source domain to a target domain. Windows by default caches the previously logged on username, and ADMT v3.2 does not make any changes to this.

My customer had expressed a business need to have this cached domain and username changed or removed, both to help mitigate tickets to the help desk and to provide the most transparent experience possible to the end user.

Basically, the customer did not want users to show up Monday morning at their recently migrated workstation and attempt to log in with their previous domain credentials just because Windows kept the cached value of the last logged-on user.

This customer had reasons to keep the old account active in the source domain, and since everything was on the same network and subnet internally, you could imagine the headaches that could come if Bob from HR logs in using his old credentials on his now migrated workstation.


Due to the nature of ADMT and Group Policy, it was determined that the best solution to this issue was to perform the following to remove the cached last logged-on username and domain (a scripted sketch of the source-domain steps follows the list):

  • In the source domain create an OU by the name of “Pre-Migration Computers”
  • Create a new GPO that enables the policy “Do not display last user name in logon screen.”
  • Prior to migration, place the computer objects that are to be migrated in the “Pre-Migration Computers” OU to allow the group policy to apply to the computer objects ahead of time.
  • In the target domain, create an OU by the name of “Post-Migration Computers”
  • Create a new GPO that enables the policy “Do not display last user name in logon screen.”
  • Link the new GPO to the “Post-Migration Computers” OU so that it applies to computer objects once they are migrated into the target domain.
  • At the time of the migration, migrate the computer objects from the “Pre-Migration Computers” OU in the source domain to the “Post-Migration Computers” OU in the target domain.
  • Once it has been determined by the customer that the end user has had enough time, or has been verified to have logged into their workstation post-migration with their new credentials, the computer object can be moved freely to any desired OU regardless of whether the GPO created as a part of this process is applied to it, or not.
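
For illustration, here is a rough sketch of the source-domain half of the procedure, assuming the RSAT ActiveDirectory and GroupPolicy modules. It writes the DontDisplayLastUserName registry value that backs the policy rather than clicking through the editor, and the domain and OU names are placeholders:

    Import-Module ActiveDirectory, GroupPolicy

    # Staging OU in the source domain
    New-ADOrganizationalUnit -Name "Pre-Migration Computers" -Path "DC=source,DC=example,DC=com"

    # GPO that hides the last logged-on user name at the logon screen
    New-GPO -Name "Hide Last Logged-On User"
    Set-GPRegistryValue -Name "Hide Last Logged-On User" `
        -Key "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System" `
        -ValueName "DontDisplayLastUserName" -Type DWord -Value 1
    New-GPLink -Name "Hide Last Logged-On User" -Target "OU=Pre-Migration Computers,DC=source,DC=example,DC=com"

The same three steps repeat in the target domain against the “Post-Migration Computers” OU.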

Skype for Business Online Hybrid Coexistence – User Moves with Move-CSUser

While onsite with a customer who had John Weber and me set up a Skype for Business hybrid coexistence with their previously configured Office 365 tenant, we ran into a conundrum with setting up a PS Session from one of their existing edge servers.

Due to their security measures (i.e., various internal firewalls and proxies), we were unable to establish a PS Session from their network to SfB Online, most likely because WinRM port 5985 was blocked.

We discovered this when trying to establish a PS Session to run the following command from one of the edge servers:
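
For context, establishing that remote session from an on-premises server looks roughly like this with the Skype for Business Online Connector module of that era (the admin account is a placeholder; the exact command we ran afterwards isn’t reproduced here):

    Import-Module SkypeOnlineConnector
    $cred = Get-Credential admin@contoso.onmicrosoft.com
    $session = New-CsOnlineSession -Credential $cred
    Import-PSSession $session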

We were not able to run the command from their network environment; we were, however, able to run it from a hotspot connection and change the value, thus establishing coexistence.

The customer wanted to be able to manage the coexistence (i.e., perform the user moves) from one of their edge servers, and they were concerned about how they would run the Move-CsUser cmdlet from that server considering that WinRM could not establish a PS Session.

With this customer, making the change requests to open this port could take weeks to get approved.

To make matters worse, in our troubleshooting we could not telnet over port 443 from the server to the URL that credentials get passed to in their O365 tenant when performing the move with the Move-CsUser cmdlet.

We ended up being able to run the Move-CsUser cmdlet regardless of all of the above, which caused us to contemplate how the command actually performs the moves to SfB Online, so we could provide documentation to the customer.

The only conclusion we could come up with, since a few hours of research did not turn up an answer to present to the customer, was that the cmdlet must have been passing the credentials to their DirSync server; since they had already synced users to O365, the DirSync server must have been relaying the credentials and the other changes needed to effectively show the user as “in the cloud.”

For now, the way the Move-CsUser cmdlet operates remains Microsoft “voodoo” magic, at least until I or someone else gets a chance to analyze the network traffic generated while Move-CsUser moves users from on-premises to Skype for Business Online.

ADMT Breaks Default File Associations Registry

On my most recent project with a customer, I decided to take on a task of filling in for a fellow engineer who became unavailable on a project he was working on.

Since I was on the bench, I figured why not take on a few hours of ad-hoc work to fill in some time.

With this particular work, I was tasked with resolving a couple of issues, specifically one that intrigued me when using ADMT v3.2 (Active Directory Migration Tool).

Not too long ago, Microsoft updated ADMT v3.2 claiming support for all currently supported operating systems including Windows 8.

However, I don’t think it is fully supported, nor does it work as well as Microsoft would probably like to say it does.

This customer had conveyed that they had run into the following issue:


Use ADMT 3.2 to migrate Windows 8* machines to a new domain and forest.


The migrations finish successfully without any problems, but when the user logs on to their migrated workstation, all of the default file extension associations are lost. Even after choosing a program to open the file in question, the prompt turns up again if the user chooses to open the file again after closing it. This happens for just about every major default or manually set file extension association, such as .jpg, .pdf, .xls, etc.

Why Microsoft? Why?

Turns out I am not the only one who has been running into this issue, however.

In this Microsoft Forum Post several other people have mentioned this issue as far back as 2013!

As of recently, the thread on this issue has been closed.

Their support moderator says that “For File Associate issue, is should not related to ADMT.”

By the way, you read that quote correctly. I don’t think this moderator understands the issue, let alone knows how to type out proper English.

Another user does a very good job of describing exactly what is going on, and in more detail to their individual experience:

“The ADMT migration runs perfectly, without issues, but upon completion when a user logs into their PC all their file associations are broken – reg keys under HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\FileExts all seem intact, but until I go in and delete the UserChoices reg key I cannot set these file associations again!

Looking at a PC before and after migration, there doesn’t seem to be any changes (other than the new user is added to the permissions obviously)!

Interestingly, after deleting user choices, I am still not prompted to set “default program” when I double click an affected file type…I have to right click and “Open With” before I see the checkbox to set the default!

Very strange behavior and must be something to do with the security translation of the registry as part of ADMT if you ask me?”

All in all, I was getting nowhere with what I could find, as this forum was the only thing that actually had some relevant pointers for me that applied to what I was experiencing with my customer.

The Fix: 

I was able to fix the issue after trying many different approaches.

One of the strategies was to implement a GPO that set the default file associations on the computer objects post-migration.

The policy I implemented sets the default file associations using the GPO under: “Computer Configuration\Administrative Templates\Windows Components\File Explorer\Set a default associations configuration file”

Using a default associations file exported with dism.exe, as the policy suggests, held no results even after verifying that the policy was successfully applied to the computer object.
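
For reference, the associations file that the policy consumes is normally exported and applied with DISM, along these lines (the path is arbitrary):

    # On a reference machine with the desired associations already set:
    Dism.exe /Online /Export-DefaultAppAssociations:C:\Temp\AppAssoc.xml

    # The GPO then points at that XML; it can also be imported manually:
    Dism.exe /Online /Import-DefaultAppAssociations:C:\Temp\AppAssoc.xml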

Regardless of this implementation, various file types (e.g., .jpg, .bmp) all prompt with an inquiry asking which program you would like to use to open the file, effectively resulting in a non-resolution.

Obviously this is an unacceptable after-effect of migrations that were promised to be as seamless as possible to the end users.

What did work, however, was creating a GPO with a startup script that runs regedit.exe and imports a .reg file which sets all the default file extension associations back to the defaults!

My steps were as follows:

  • Create or obtain .reg keys which contain the settings for file extension associations that will result in the desired effect.
    • Example .reg keys that can be downloaded can be found at:

Placement of Registry Keys (.reg files)

  • Create a GPO in the domain with a startup script that runs regedit.exe with the “/s” (silent) parameter, pointing to a .reg file that changes the registry keys back to the default settings defined by Microsoft for Windows 8 and Windows 8.1 (an example command line is sketched after this list).

Startup Properties of the GPO

  • Apply the GPO to a designated migration OU container
  • Migrate workstations to the target migration OU which has the GPO applied to it.
  • Allow the GPO to apply to the migrated workstation.
  • Once the GPO has been applied, the workstation can be moved to any desired OU without affecting the results of the GPO that applied while it sat in the designated migration OU.
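
As referenced in the GPO step above, the startup script itself can be a single line; for example (the share path and file name are placeholders):

    regedit.exe /s "\\source.example.com\NETLOGON\DefaultFileAssociations.reg"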

Additional Notes:

In my testing, I tried to recreate the registry keys manually (by hand) on the test machine post-migration, setting them to the default key values; regardless of my deleting and creating new keys, it held no results, even after a reboot of the local machine.

The registry keys included in the GPO I made touch each store in the local machine registry that handles file associations. The import deletes the old entries and creates completely new ones that mimic the default associations.

Personally, I am leaning towards jumbled security permissions post-migration, because I couldn’t change the values of the keys manually. I would see an error saying I couldn’t change it, though not specifically a permissions error. I had to delete the key altogether and create a new one.

The reason I am not inclined to think the default program definition is a player in this issue is my experience manually specifying the default program when opening a .jpg, for example. I could specify the default program, but it never stuck, meaning every time I opened the .jpg it would prompt again to choose the program I wanted to use to open it.

Very peculiar behavior. I am inclined to agree with one of the contributors in the forum link I shared with you when he says: “Microsoft needs to understand customer impact so it can prioritize work.”

Can’t Verify Domain in Office 365

If you have been creating trial tenants over and over again like I have been with Office 365 and their E3 license, it is likely you may run into the same issue I have.

Microsoft does not really give you any hints in regards to troubleshooting your domain verifications.

I am currently going after the MCSA for O365, and the way I have been able to learn the ropes of O365 for the 70-346 and 347, has been purely from signing up for their E3 trial.

No need to spend any money at all, and you get SharePoint, Exchange, SfB, etc. Additionally, you get 25 user licenses to play around with.

“What if my trial ends?” you say?

The good news is that you can extend the trial an additional 30 days (totaling 60 days)! I have a single domain name I use to work with O365 and a lab environment at home in which I set up an Exchange hybrid, manage SharePoint Online, etc.

Worst case scenario: you use your total of 60 days, and MS asks for money. Just buy another 99-cent throwaway domain from your registrar of choice and start another trial. Once MS trashes your original, expired tenant domain (30 days after your original 30-day or extended 60-day trial ends), you can use that same domain name again with a new O365 tenant and start again.

Rinse. Repeat. Profit.

The legacyExchangeDN and why it’s a pain in my ***

After a recent Exchange migration project, I can say I officially find the legacyExchangeDN attribute to be a real pain in my ***.

I spent a lot of time troubleshooting and resolving mail delivery issues due to this little attribute, which has caused me to do a small write-up on its importance, and what to do when you screw things up like I did.


We are often tasked with migrating from one email system to another, especially in regards to Microsoft Exchange. This could also come in the form of migrating from one domain name to another, such as when a business changes its name, or when a company acquires another company.


In an effort to try and make co-existence work, there are several things to consider. For example:

  • Mail Contacts
  • Distribution Groups
  • Mail Users

Once I came up with my messy and convoluted scripting through the Exchange management console, I was able to successfully collect an NDR which contained a string that looked like a bunch of jumbled verbiage. Definitely not human readable.

But, thanks to the wonders of google, I was able to find a way to not only crack the code, but re-apply my lost x500 to my user objects!
*Obscure web ad logo goes here* “I used this one simple trick and techies hate me!”

Just kidding. Here is the way I resolved my issue, straight from the articles linked at the bottom of this post:

The one thing that always seems to come back to bite us when these mail contacts or mail users are converted to a mailbox is that, all of a sudden, users start seeing NDRs in Outlook barking that a recipient has been deleted and is unavailable.

This is caused by Exchange, which uses an X500 address to route mail internally. Once the attributes have been removed from a user object or contact object in Exchange, and you create a new mailbox for that user or contact, Exchange will automatically create a new X500 address and apply it to the object. Outlook’s cached recipients, however, still resolve to the old legacyExchangeDN, and when that X500 address no longer exists on the new mailbox, the message bounces.

With that said, collecting the legacyExchangeDN attribute from any contacts and mailbox users is a must prior to disabling or deleting them.
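
A sketch of capturing that attribute up front with the AD module; the OU path and output file are placeholders:

    # Export legacyExchangeDN for every mail object before touching anything
    Get-ADObject -Filter { legacyExchangeDN -like "*" } `
        -SearchBase "OU=Mail Objects,DC=corp,DC=example,DC=com" `
        -Properties mail, legacyExchangeDN |
      Select-Object Name, objectClass, mail, legacyExchangeDN |
      Export-Csv C:\Temp\legacyExchangeDN-backup.csv -NoTypeInformation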

I found out the hard way that when you disable a contact in Exchange prior to collecting this attribute, Exchange will wipe all Exchange-related attributes, including the legacyExchangeDN.

Prior to Exchange 2010 SP1 Rollup 6, the legacyExchangeDN value was something that you could predict and delineate by viewing the syntax of an enabled user. However, within newer versions of Exchange and within Office 365, Exchange appends three random hex characters to the end of the attribute.

That means you are pretty much screwed in regards to migrating between organizations if you disabled a mail enabled user or contact without collecting this attribute prior.

Or are you?

Well you still could be, but there is a chance you could find it if you make a mistake like I did.

Solving this situation comes down to reporting in Exchange. If you don’t want to wait for users to report NDR messages, you can search the transport logs for failed delivery messages. This can be accomplished by utilizing a PowerShell query against the message tracking log.
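
For example, from the Exchange Management Shell (the date range is arbitrary):

    Get-MessageTrackingLog -EventId FAIL -Start (Get-Date).AddDays(-7) -ResultSize Unlimited |
      Select-Object Timestamp, Sender, Recipients, MessageSubject, RecipientStatus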

To create an X500 proxy address for the old LegacyExchangeDN attribute for the user, make the following changes based on the recipient address in an NDR:



  • Replace any underscore character (_) with a slash character (/)
  • Replace “+20” with a blank space
  • Replace “+28” with an opening parenthesis character
  • Replace “+29” with a closing parenthesis character
  • Replace “+2E” with a period
  • Delete “IMCEAEX-“
  • Delete the trailing “@domain” portion (everything from the “@” to the end)
  • Add “X500:” at the beginning.
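
As a hypothetical worked example (the NDR address below is made up, and the domain you strip off the end will be whatever appears after the “@” in your own NDR):

    # Hypothetical IMCEAEX address pulled from an NDR
    $ndr = "IMCEAEX-_O=CONTOSO_OU=EXCHANGE+20ADMINISTRATIVE+20GROUP+20+28FYDIBOHF23SPDLT+29_CN=RECIPIENTS_CN=John+2ESmith@contoso.com"

    $x500 = "X500:" + ($ndr -replace '^IMCEAEX-', '' `
                            -replace '@contoso\.com$', '' `
                            -replace '_', '/' `
                            -replace '\+20', ' ' `
                            -replace '\+28', '(' `
                            -replace '\+29', ')' `
                            -replace '\+2E', '.')

    # $x500 now reads:
    # X500:/O=CONTOSO/OU=EXCHANGE ADMINISTRATIVE GROUP (FYDIBOHF23SPDLT)/CN=RECIPIENTS/CN=John.Smith

    # Add it back to the new mailbox from the Exchange Management Shell
    Set-Mailbox jsmith -EmailAddresses @{ Add = $x500 }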


IMCEAEX non-delivery report when you send email messages to an internal user in Office 365:

1. Clear the auto-complete cache file
2. Create an X500 proxy address for the old LegacyExchangeDN attribute for the user


Mystery of adding X500’s – Seriously awesome article

Fixing IMCEAEX NDRs – Missing X500 Addresses:

All in all, this totally saved my life, and it definitely made me think about how I will go about deleting contact objects in the future.