Recently I made the move to migrate all my VMs from running on VMware ESXi to running on Hyper-V. This was the first time that I have shifted to another platform away from ESXi as my main hypervisor platform.
As it was a single server, I backed up all my VMs with Veeam, took a copy of the configuration file, and built a new Veeam Backup & Replication server on Hyper-V. Importing the config file then allowed me to restore all the VMs across to Hyper-V.
Whilst all VMs booted up, the one issue I did run into was that the ethernet adapter had changed from a VMXNET3 to a Hyper-V Virtual Network Adapter.
For Windows VMs, this was no issue and they all connected as if nothing had changed.
For Linux servers (Ubuntu in my case), the adapter name did change and the Netplan .yaml file did not update. Thus, there was no connectivity to the outside world.
While the adapter in Hyper-V Manager appeared to have a connection, no IP address was displayed because the adapter was no longer associated with one. (This would be relevant to any hypervisor migration where the adapters aren’t exactly the same.)
There is a very simple fix. As I mentioned above, the netplan .yaml file does not automatically update to reflect a new adapter name, as there is no detection step like the one that runs when installing the OS.
To find your new adapter, you can run ip link show – you will be able to match the MAC address on the adapter to the results in the ip link output.
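If you want to script that check, the same name-to-MAC mapping can also be read straight from /sys (a standard Linux path; nothing distribution-specific assumed here):

```shell
# List every interface name alongside its MAC address, so it can be
# matched against the MAC shown in the hypervisor's VM settings.
for nic in /sys/class/net/*; do
    printf '%s  %s\n' "$(basename "$nic")" "$(cat "$nic/address")"
done
```

The loopback interface will always appear in the list; the entry whose MAC matches your VM's settings is the new adapter name to use in Netplan.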
Once you have your new adapter's name, you can then update the .yaml file. Using the below commands, you will first confirm the name of the .yaml file and then edit it with your preferred editor (in my case, vi). You can use tab completion to fill in the file name once you have the netplan folder specified.
ls /etc/netplan/
sudo vi /etc/netplan/00-installer-config.yaml
Inside the .yaml file, find the line underneath “ethernets” – this will be your old adapter name. Press the i key to enter insert mode.
Using my screenshots as the example, update “ens34” to “eth0”.
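A minimal before/after sketch of what the change looks like (the file name, DHCP setting and old adapter name here are examples from my environment; yours may differ):

```yaml
# /etc/netplan/00-installer-config.yaml
network:
  version: 2
  ethernets:
    eth0:          # previously "ens34" under VMware
      dhcp4: true
```

Only the adapter key changes; the rest of the configuration stays as it was.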
Press esc to escape the insert mode, then use :wq to write and quit vi.
Lastly, make sure to apply the netplan configuration. This is easily achieved with sudo netplan apply – once applied, run through a ping test and make sure your network is functioning correctly.
My understanding, and experience so far, is that with Hyper-V the adapter name is going to be the same followed by ascending numbers (eth0, eth1, eth2, etc.) – However, it is always best to confirm first.
Hopefully that will save you some time when migrating between various hypervisors where the adapter changes.
With technology ever-changing, and businesses continuously facing security challenges, services such as Azure and AWS need to keep adapting and improving their security posture to ensure that they continue to protect their customers and services. While this is certainly the correct thing to do, it does sometimes create issues for the end users or other applications making use of those services.
If you are a Veeam Backup for M365 customer or user, you may come across the below warning messages when running a backup job. This is because Microsoft has made some additional security changes around Graph API access permissions. This isn’t really all that bad, aside from the items that get skipped during that backup window, and it is extremely easy to resolve.
12/02/2025 9:07:25 PM :: Missing application permissions: Exchange.ManageAsApp. Missing application roles: Global Reader. Public folder and discovery search mailboxes will be skipped from processing, and a shared mailbox type will not be identified.
12/02/2025 9:07:25 PM :: Missing application permissions: ChannelMember.Read.All. Private and shared team channels will be skipped from processing
To resolve this, all you need to do is run through editing your organization once again and allow Veeam to update the application registration.
You do need to make sure you still have, in the certificate store, the certificate that you used to create the application or last updated it with.
Right click on the organization and select Edit Organization
Click Next until you get to the “Microsoft 365 Connection Settings” – Here you can select Use an existing Azure AD Application
Your Username and Application ID will appear in the boxes. Here you will need to select Install and choose the certificate you used to connect to the application when you either created or last updated it.
Ensure to check the Grant this application required permissions and register its certificate in Azure AD box. Click Next
You will then be requested to copy the code, go to https://microsoft.com/devicelogin and log in with a Global Administrator account to authorize the change.
Once authorized, you will then see the progress of the update.
Once all updated click Finish. You can then go ahead and start the job and wait for it to complete.
Once the job has completed, confirm that all was successful and there are no further warnings.
This should resolve any issues Veeam Backup for Microsoft 365 had backing up using the Graph API.
It’s another week and Veeam has yet again released a new version. This time it is Veeam Backup for Office 365, and it is backing up the truck and dumping a sizeable amount of new features. Release notes here
Interface Changes
In v6, Veeam introduced the Self-Service Restore portal, allowing customers to log in with their M365 account and restore from their backup. In v7, this has had an update and a facelift, giving users a greater experience.
While there have been some portal changes, the console has also had some minor changes made to its colours and styling. Whilst these appear to be minor, if you have spent enough time in the earlier versions, you will notice the difference.
Immutability
One of the newest features here, which has been around in the VBR world for the last few versions, is immutability. Having a backup is one thing, but is it truly a backup if it is able to be removed or manipulated? Veeam takes care of this by providing immutability, locking down your backups to ensure that they go untouched and continue to provide a safe and secure copy.
Backup Copies
In keeping with a safe and secure copy, Veeam has now implemented Backup Copies to help get closer to the 3-2-1 rule (3 copies – 2 media types – 1 offsite). Backup copies have been around for a very long time in the Backup & Replication product set, allowing multiple copies of the backup to be placed in another repository, whether that be connected locally or via Cloud Connect. Veeam Backup for M365 allows the backup to be created on the attached local/on-prem repository and then the backup copy to be pushed to a secondary location such as the Azure Archive tier.
In order to create your backup copies, you need to create a location that will be able to accept the copy. If you are an Azure shop, for example, a new option in creating an Azure Blob Storage type has been added. Here you can select to use Azure Archive Storage for cost-efficient storage, which is a great place to send your backup copies to. If you do not have a repository configured similar to this, then backup copies will not be an available option.
New Object Storage Locations
It’s been no secret that Veeam have really been knuckling down to implement Object Storage across their product set, and slowly they have been adding it to each of their products. This year we saw Veeam Backup and Replication v12 introduce direct to Object Storage backups, whereas this has been in the M365 product set for some time as an offload target for the repository. Along with the interface changes, there is a new selection page when starting to create your repository. You will need to have created an Object Storage location under the Object Storage menu, but once that is available, you can select which repository type you would like to create as your primary target storage.
With the full support of direct to Object Storage, Veeam has also added native support for Wasabi Cloud Storage as a direct target, allowing for cheap, reliable Object Storage. In the past, you would need to configure a connection to Wasabi via an S3 Compatible storage type.
History Search
It might seem a little strange, but for me, this has been one of my biggest gripes with the Veeam Backup for M365 console. The lack of an easy way to search the console for historical backup reports has now been resolved with the introduction of a search bar. Here you are able to search on keywords to find the log file for a particular job instead of needing to scroll through the long list of logs. This has been a very welcome enhancement.
Conclusion
Overall, even though my list above doesn’t appear very extensive in the number of items, each one has a huge impact on productivity, efficiency and the ability to ensure your backups are safe, secure and available for when you need them.
If I had to choose, my two favourite additions are Backup Copies and Immutability; these are critical to ensuring that your data is safe. Data protection should be one of the highest priorities on any business’s list. It ensures that when the inevitable happens, you can get your business back up and running as quickly and efficiently as possible, no matter what the disaster is.
This is Part 2 of my notes from the Pluralsight course “Extending PowerShell” by Matt Allford. As mentioned in part 1, these are just the notes I took while reviewing the series again. They are in no way a replacement for the course, which I highly recommend checking out. I hope you find something useful here that may encourage you to explore the PowerShell world.
Introduction
Exploring and working with new PowerShell Modules that are not installed by default.
Covering key components of working with PowerShell package management and repositories.
Finding and Installing Modules
Updating Modules
Removing/Uninstalling modules
Offline Module installation
Understanding Package Management and Repositories
Linux uses YUM/APT, etc. to install and manage packages, but that isn’t as easy on Windows. There is also Homebrew for macOS and Chocolatey for Windows, which can be installed from a 3rd party. The native Windows package manager is built into PowerShell.
Should be keeping track of the software installed.
MS released PackageManagement (formerly called OneGet)
The PackageManagement feature is an aggregator – install, update and uninstall
Comes with a set of PowerShell Cmdlets
There are 4 pillars that make up the PackageManagement architecture.
End User: this is the interface in which we enter our actions to interact with the PackageManagement Core
PackageManagement Core: provides the functions of running the Discovery, Install/Uninstall and Maintaining an inventory of the packages.
PackageManagement Providers: The PackageManagement framework provides an extensible plugin model that allows different installer platforms to plug into the framework. These are PowerShellGet, MSI, MSU, Program and NuGet by default. However, there are also 3rd-party providers such as Python, AppX and PHP, plus more.
Package Sources: This is the source\repository location of where the Package Providers will pull the package from. E.g. PowerShell Gallery.
PowerShellGet.
Package Management Provider: One of the most common package management providers, it is used to manage PowerShell modules, scripts and DSC resources.
Core functions – Discover, Install, Update and Publish packages.
It will target the PowerShell Gallery, an online repository, by default, but you can also change the location.
It is available in Windows 10+, Server 2016+, PowerShell 6+ or WMF 5.1+. It also requires PowerShell 3.0+ and .NET Framework 4.5+.
PowerShell Repositories
A central location to share, store and search for PowerShell content; this is where you would publish your PowerShell modules in order to access them easily. Also a good place for your scripts.
Use PowerShellGet to find new repositories and to install, update and publish modules to a PowerShell repo.
Repos can be public and private.
PowerShell Gallery is the most widely used Repository.
DEMO: Working with Package Management and Repositories
When running Get-Command -Module PackageManagement you will see a list of cmdlets that will search, install and uninstall packages through the Package Management Framework
Running Get-PackageProvider will display the currently configured package providers – when running in PowerShell 7, you may only see a couple of providers, but if you run it in an earlier version, you should see several more. This is because PowerShell 7 did not support the other package providers at the time of writing.
PowerShellGet is the most common, however NuGet is new in PowerShell 7 and is the default PackageManager for the .NET framework.
Running Get-Command -Module PowerShellGet will display all the PowerShellGet commands that are available; this will also include aliases and functions.
Running Get-PSRepository will return the repositories set up to install and save modules and scripts from. By default, the PowerShell Gallery is configured and is untrusted. When you run a request against an untrusted repo, you will be required to accept or decline whether you trust the source. You can add other public and private repositories.
For private repositories, you can take a look at this article from Microsoft: Working with Private PowerShellGet Repositories. It outlines two types of repositories you can deploy for private usage – NuGet and file share – along with their advantages and disadvantages.
Create a new Local Repo
Example from the course:
Run New-Item -Path C:\LocalPSRepo -ItemType Directory
From there you can create a new SMB Share on that folder by running
New-SmbShare -Name LocalPSRepo -Path C:\LocalPSRepo\
**Remember back in part 1 we discussed the auto loader. Here we have not manually imported SmbShare; however, the auto loader has found the module and loaded it for us.
Run a Get-SmbShare again to see the new share added.
To register the repository as a default, trusted repository, you can use the Register-PSRepository cmdlet; this will set up the source location and the script source location for the private repository.
The -Name does not need to be the same Name as what you have called the share.
The -InstallationPolicy will set whether the repository is trusted or not.
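A sketch of the registration command (the repository name and share path follow the demo above; substitute your own):

```powershell
# Register the SMB share created earlier as a trusted repository.
Register-PSRepository -Name 'LocalPSRepo' `
    -SourceLocation '\\localhost\LocalPSRepo' `
    -ScriptSourceLocation '\\localhost\LocalPSRepo' `
    -InstallationPolicy Trusted
```

Because -InstallationPolicy is set to Trusted here, installs from this repo won’t prompt for confirmation.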
Run a Get-PSRepository again and you will see the Repository added. This will now allow the PowerShellGet provider to search through this new repo to find modules and scripts as well as still searching the PowerShell Gallery.
To set the PSRepository from Untrusted to Trusted or Trusted to Untrusted, you can do this by running the below:
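For example, to flip the default PowerShell Gallery between the two policies:

```powershell
# Mark the PSGallery repository as trusted (no more install prompts)...
Set-PSRepository -Name 'PSGallery' -InstallationPolicy Trusted

# ...or set it back to untrusted
Set-PSRepository -Name 'PSGallery' -InstallationPolicy Untrusted
```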
A useful tool is the Find-Module command, which will help find certain modules. There are no required parameters, but you can use such parameters as -MinimumVersion, -MaximumVersion, -RequiredVersion and -AllVersions.
The -IncludeDependencies parameter will display a list of other modules that are required for the module to function correctly. When you install the module, the dependencies should be installed along with it at the same time; you can, however, install those dependencies separately.
Running Find-Module on its own will return the latest versions of all modules available in the configured PS repositories.
Running Find-Module -Name *<name>* will search and return results that include the name in wildcard form.
Running Find-Module -Tag <name> will send back results with the tag that is referenced for a module.
**By default, the latest version will be returned. Running with -AllVersions will display all available versions so you can find a specific number to install.
Alternatively, running Find-Command -ModuleName <module name> will display all the commands within that module; this will help when deciding whether the module is the one you require.
You can also drill down to find which module a specific command is located in. Find-Command -Name <Command> will return a list of modules containing that command.
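A couple of examples (Pester is used here purely as a well-known module name; any module works):

```powershell
# Find a module, including the modules it depends on
Find-Module -Name Pester -IncludeDependencies

# List every command shipped in a module
Find-Command -ModuleName Pester

# Find which module(s) provide a specific command
Find-Command -Name Invoke-Pester
```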
Running Install-Module -Name <Name> will install the module without any other parameters. However, there are a few parameters that are available, such as the minimum and maximum versions.
The -AllowClobber parameter will give you the option to install a module even if its commands conflict with an already installed module.
-Scope defines the location the module is installed to. Setting it to CurrentUser places the module in a path only the current user can access (as available in the environment variable path), whereas setting -Scope to AllUsers places it in a default location that is, of course, available to all users.
The default scope for the module is determined by the version of PowerShellGet that is installed.
PowerShellGet V1 = All Users (Preinstalled with Windows 10) PowerShellGet V2 = CurrentUser (Installed as part of PowerShell 7)
The newest version will be used by default.
To install a module, it is as simple as running Install-Module -Name <name>
Alternatively, you can install to a different scope; however, if you do not have elevated privileges, you will receive an error and will not be able to install to the AllUsers default location.
Run the command on its own and it will go to the default scope, CurrentUser.
As the repository is untrusted, you will receive a prompt asking whether you would like to trust the repository. It will also advise that you can set the InstallationPolicy against the repository.
As a bonus: when installing the latest version of a module while a previous version is already present, you will receive a warning that a version is already installed, asking if you would like the latest version installed side-by-side.
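Putting those install options together (Pester and the version number are stand-ins for illustration):

```powershell
# Install for the current user only (no elevation required)
Install-Module -Name Pester -Scope CurrentUser

# Pin a specific version rather than taking the latest
Install-Module -Name Pester -RequiredVersion 5.3.0 -Scope CurrentUser

# -Force suppresses the untrusted-repository prompt mentioned above
Install-Module -Name Pester -Scope CurrentUser -Force
```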
Updating Modules
From time to time authors will keep their modules up to date, and it is best to update them to get patches for issues running a cmdlet, or new functionality.
To force the installation of an older version, you can run Install-Module -Name <name> -RequiredVersion <version>
To update this to a specific version, and not the latest, run Update-Module -Name <name> -RequiredVersion <version>
** Updating a module does not remove the previous version. That must be removed using the Uninstall-Module command.
When not specifying a version parameter, the latest version will be installed.
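A sketch of the update flow (again using Pester as a stand-in; the version number is illustrative):

```powershell
# Update to the latest published version
Update-Module -Name Pester

# Update to a specific version instead of the latest
Update-Module -Name Pester -RequiredVersion 5.4.0

# See which versions are now installed side-by-side
Get-Module -Name Pester -ListAvailable
```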
If the module was not installed by PowerShellGet, then PowerShell will not be able to update it, as the PackageManagement core has no record of it being installed.
Running (Get-Module).count will show the number of modules currently imported into the session, not everything installed on the computer.
Running (Get-Module -ListAvailable).count will show all modules installed on the computer, including any that were not installed using PowerShellGet.
Removing and Uninstalling Modules
Removing a module will remove it from the session but not from being installed on your computer. It is the opposite of Import-Module.
Uninstalling will remove the module from your computer – the opposite of the Install-Module command.
Run Remove-Module -Name <name> to remove the module
The module is still installed and can be viewed by running Get-Module -Name <name> -ListAvailable
When using Uninstall-Module, the only required parameter is -Name. However, there is still the option to choose the version of the installed module. This is handy when you have updated past a previous version and need to clear out the older ones.
When running Uninstall-Module -Name <name> (without specifying a version), the latest version will be uninstalled and the lower version numbers will remain. To remove all versions, you will need to use the -AllVersions parameter
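For example (Pester and the version number are illustrative stand-ins):

```powershell
# Uninstall only the latest installed version
Uninstall-Module -Name Pester

# Uninstall a specific older version left behind after an update
Uninstall-Module -Name Pester -RequiredVersion 5.3.0

# Or clear out every installed version in one go
Uninstall-Module -Name Pester -AllVersions
```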
DEMO: Offline Module Installation
This is a method usually used in dark sites or air-gapped sites that have strict security configurations, or that require security teams to evaluate modules before marking them as approved.
Save-Module
This command can be used to download/save the module (Without installation) to a computer that is connected to the internet.
Save-Module allows you to inspect the code of the module, as you are not simply installing it. The code is already pre-scanned in the PSGallery, but you can also inspect it yourself.
You can still download using parameters for specific versions, but there are two required parameters: -Name and -Path. As you are not installing this, and you are saving it to copy to another location, you will need to specify a path.
Save-Module -Name <name> -Path <Path>
To install a module from an offline seed, the module should be copied into a default repository location in the environment variable path. Once you know the location, you can run Copy-Item -Path <Source> -Destination <Destination> -Recurse so the module can be detected by the PowerShellGet framework.
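The whole offline round trip might look something like this (the paths and module name are assumptions for illustration):

```powershell
# On the internet-connected machine: download without installing
Save-Module -Name Pester -Path 'C:\Transfer\Modules'

# After moving the folder to the offline machine, copy the module
# into a directory that is already listed in $env:PSModulePath
Copy-Item -Path 'D:\Transfer\Modules\Pester' `
    -Destination "$HOME\Documents\PowerShell\Modules" -Recurse
```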
Diving into my 2022 goals and starting with revisiting PowerShell, this post is part 1 of the notes I have taken while going through Matt Allford‘s “Extending PowerShell” Pluralsight course. Please head over and check out the course and let him know what you think. There is a lot of in-depth detail and knowledge on how to work with PowerShell modules, with context around each of the topics.
Snapins
Snapins were used back in version 1 of PowerShell – they added individual commands, as opposed to modules, which hold series of commands.
Required installation from an installer file and registered to the local machine
Lacked capabilities like dependency handling.
Modules
Modules – Preferred code package method.
New
Come as a set of files; some contain DLL files.
Allows for reuse and abstraction of PowerShell commands
Package of related commands, grouped together and can be vendor specific so that you only need to load the modules that are required. There are some built in modules that are imported by default and other modules can be added into the Environment Variable to load later on.
An example module available.
Bits Transfer Module
This module contains Get-BitsTransfer, etc. – none of these commands work outside of BITS, but all the Get, Remove, Set, etc. commands are in a single PowerShell module
Each module is an upgrade to the previous.
Sit in the PowerShell repo
PSModulePath is default, but any location can be used
Copying and pasting a module is fine to import, but there are better ways to install and import
Module auto-loading can automatically load modules when you run a command, but you don’t always want this in case there are conflicts – e.g. Get-VM is both a VMware and a Hyper-V command.
-NoClobber is a parameter that prevents conflicting commands from loading when an existing command has already been loaded into the session
Install Module (Install-Module)
Install-Module does not import the module into the PowerShell session for use, but it does install the module onto the local machine into a path that is in the environment variable location, allowing the module to be either imported into a session later or picked up by auto-loading.
Using Install-Module -Name <name> will install the module from the PowerShell Gallery if it exists.
In the above example, once the module is installed, it is not automatically imported; Import-Module needs to be run. As this is running in PowerShell 7 Core, VMware.ImageBuilder is not supported and is not imported.
DEMO – Working with Module Paths
To find the PSModulePath, use $env:PSModulePath. You can also add -split “;” to break the output onto separate lines wherever a semicolon is located.
To add additional default locations for modules, you can add them via $env:PSModulePath = $env:PSModulePath + “;C:\<My module path>”
By adding the semicolon to the start of the new location above, the environment path will see it as a separate path instead of a continuation of the previous one. Without it, you would end up with e.g. C:\Windows\system32\WindowsPowerShell\v1.0\ModulesC:\<my module path>
Setting $env:PSModulePath will only apply to the current session. To make it persistent, it needs to be added to the PowerShell profile or to the system environment variable. This can be done via the Windows GUI or through PowerShell.
The commands above first store the existing system environment variable paths in a variable, which the second command can then list out. The third line creates a new variable joining the existing system environment variable paths and the new path together (hence the + sign) – keep the semicolon prefix in mind. The last command sets the system environment variable path to the $newpath variable. You need to make sure you join the previous environment variable value and the new path together each time, to ensure the complete PSModulePath is set.
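As a sketch of that sequence (C:\MyModules is a placeholder path; the Machine scope requires an elevated session):

```powershell
# Read the current machine-level PSModulePath
$current = [Environment]::GetEnvironmentVariable('PSModulePath', 'Machine')

# Join the existing paths and the new one - note the semicolon separator
$newpath = $current + ';C:\MyModules'

# Write the combined value back so it persists across sessions
[Environment]::SetEnvironmentVariable('PSModulePath', $newpath, 'Machine')
```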
DEMO – Importing and Auto Loading Modules
How modules get imported explicitly and automatically.
First run Get-Help Import-Module to view the mandatory parameters.
– The only required parameter is -Name. Run Import-Module -Name <name of module> to import the module.
Auto-loader
The auto-load feature kicks in when a cmdlet is run that belongs to a module that has not been imported but is sitting in a location listed in the environment variable. PowerShell will search for the required cmdlet and import the module
Auto-load won’t work if the module is not sitting in one of the paths in PSModulePath.
Below in the example, using the command Get-SmbShare searches for the module and then auto loads the module as it is an available option.
Import a module that is not in an environment variable location
In the event you have a module that has been downloaded to another location that isn’t in the Environment Variable location, you can easily import it using the full location path in your import command.
Import-Module -Name C:\<Path to file>\
e.g Import-Module -Name C:\PowerShell\Module
Use Import-Module to import with a prefix
As above, we mentioned the -NoClobber parameter, which will skip any commands in a module that conflict with an already imported module. Another method to avoid a conflict is to add a prefix, allowing the commands to be imported uniquely into the session.
In the below example, SmbShare is already loaded and does not contain a prefix (Block-SmbShareAccess)
The module is then removed and then imported back into the session with the -Prefix parameter.
This will now add the prefix “Pre” to the front of the noun in each command (Block-PreSmbShareAccess)
So when running the command after importing with a prefix, it will need to be:
C:\> Block-PreSmbShareAccess
Demo – Identifying Commands in a Module
Running Get-Command on its own is not a great way to search for a particular command in a PS module. It will list every command that is available, which is far from ideal.
If you run Get-Command -ListImported, this will only show you, you guessed it, the commands available in the modules currently imported into the session.
In order to only find the commands of a particular module that is currently imported and not receive commands for other modules, then using the -Module parameter will cut this list down to only those commands that are required.
An alternative way to get the commands for a particular module is Get-Module SmbShare – however, this will only list some of the commands off to the side under the “ExportedCommands” column.
To view this better, use (Get-Module SmbShare).ExportedCommands to display the full list with key and value columns.
Finding a command by using the Verb
Using the PowerShell Verb-Noun layout, there may be times when you want to perform a particular action and know the verb needed to perform it. You can search on the verb to get a list of the available commands and help narrow down to the right one. This is done using the -Verb parameter followed by the verb to search for.
Get-Command -Module SmbShare -Verb Remove
Finding a Command by Wildcard
There may be a case where you only know part of the command you are looking for and need to perform a wildcard search. This works by using asterisks on both sides of the word you are looking for.
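For example, to find every SmbShare command with “Access” anywhere in its name:

```powershell
# Wildcards on both sides match the word anywhere in the command name
Get-Command -Module SmbShare -Name *Access*
```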
This post may come as a bit of a surprise to some of you who have seen my focus mostly on Storage, Backup and VMware products, but the world is a bigger place and there are technologies out there that are evolving and my interests have done a complete 360 back to my earlier days.
This past week, we have seen the Microsoft Build conference take place, streaming hundreds of sessions live from the presenters’ own homes. For me, this has been a great opportunity for technologists to be part of a conference they may not have been able to attend previously. I know that I would not have spent the money to attend in person, but that has more than likely changed now that I have had a front-row seat to the benefit of these sessions and what they have opened my eyes to.

For the past couple of months, I’ve been thinking about moving back into the Microsoft space. I’ve spent the last 6 years focused on virtual infrastructure and less on the guest OS. I’ve been dabbling in Linux for a little while, but I am heavily focused on using macOS, as I’ve found Windows to be a bit too restrictive – that was until Microsoft started releasing some exciting new applications that have made a world of difference to me and, I feel, have created a much more inclusive ecosystem. I use Windows 10 in the office, but any other time I will use my MacBook Pro. Since Microsoft brought out WSL, I have been using Ubuntu for all my SSH sessions and general command-line work. Prior to this, I would pull out my MacBook or fire up a Linux VM.
At Microsoft Build, there have been a number of new applications that have been released for preview that I am excited to see. The ones I am going to mention in this post are what I have found will help me in my day-to-day or just handy to have. There are many more out there, but here is my shortlist.
Windows Terminal
Here we find what looks like a standard PowerShell window, but alas, it has much greater functionality than just running a couple of one-liners or scripts. By default, there are several preconfigured command-line tools available, but you can also edit the settings via its config file and add more applications to keep them all in one spot. Windows Terminal may only be command lines, but it’s the way you set it up and use it that makes it so powerful. Windows Terminal is now at version 1.0.
PowerToys

This has been a very welcome addition. Although still in preview, PowerToys adds extra tools to customise and manipulate files, and to customise your workspace with FancyZones.
FancyZones allows you to create layouts for applications and have them snap to certain areas of your desktop, creating a usable workflow for your day-to-day activities. You can use the templates already provided or create your own layout. There are multiple colour options to show which zone is currently active and which are inactive.
The next two PowerToys eliminate the need for 3rd-party applications for bulk file renames and image resizing. Don’t you hate it when you take a new set of photos for an event and they are all named something like “img_xxx”? Well, now you no longer need to sit there renaming them one by one or run them through a 3rd-party tool. PowerToys brings a simple file-renaming feature, “PowerRename”, that can be pinned to your Explorer context menu, allowing you to easily select your files, right click and choose Rename Files. The tool uses a find-and-replace method, so you can save time and rename only certain words, if needed.
The Image Resizer works in a similar way to PowerRename. In the settings, you create presets for your image sizes; then you select your images, right click and choose Image Resizer. You are then presented with a window to select the size you would like all your selected images to be.
Windows Package Manager

If you have ever used Linux, whether a Red Hat or Debian variant, you should be well aware of the yum (dnf) or apt package managers (others exist for other distributions). Microsoft is now adding yet another feature to Windows to move more in line with the evolving community, and has released the new Windows Package Manager. Currently also in preview, it allows you to install recent releases of Windows applications through the command line using winget. Simple commands such as winget install and winget show will allow you to find and install available applications. The winget show command will list the applications and their most recent available version. Once an application has been installed via the package manager, you will be able to use it like any other application installed from an executable or MSI file.
Microsoft has certainly shown that they are listening to what users want, and are learning what people need to operate in their day jobs. Previously, we needed tools such as PuTTY to connect to SSH sessions, but with Windows Terminal and WSL 2, we can now connect with all-in-one tools from one operating system. I am excited to see what else is coming from Microsoft and where their roadmap is heading. If you missed any Microsoft Build sessions, you can head over and watch them on-demand.