Up until recently, I was familiar with the idea of a client pulling down information from a database. A database requires updates, patching, and maintenance. Usually, I’d be the one doing all that care and feeding, in addition to taking proper care of the client application. Recently I’ve been playing with API Ninjas.
API Ninjas requires a free account and provides you with an API key. Some of the things you can pull include:
Airline flight information
Cocktail recipes
Currency conversion
DNS lookup
Holidays
IP information
The list is quite extensive. An API is a great alternative to all that pesky database maintenance above. API Ninjas includes sample code for calling the APIs in Python, C#, Ruby, and so forth. However, it doesn't include anything for PowerShell.
Below, I’ve pasted code to get a random fact of the day:
$apikey = "###supersecretapikey###"
$params = @{
    Uri     = "https://api.api-ninjas.com/v1/facts?limit=1"
    Method  = "Get"
    Headers = @{ "X-API-Key" = $apikey }
}
Invoke-RestMethod @params | Format-List
##Sample output##
fact : A sneeze zooms out of your mouth at over 600 m.p.h
To use or change this code, change the Uri parameter above to the value given by API Ninjas. Examples include:
https://api.api-ninjas.com/v1/cocktail?name=$input #for taking input for cocktail recipes
https://api.api-ninjas.com/v1/iplookup?address=$IP #for taking input for IP address
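Reusing the same pattern, here's a quick sketch against the cocktail endpoint (the $cocktail value is just a placeholder; swap in whatever you want to look up):
$apikey = "###supersecretapikey###"
$cocktail = "margarita" #placeholder drink name to look up
$params = @{
    Uri     = "https://api.api-ninjas.com/v1/cocktail?name=$cocktail"
    Method  = "Get"
    Headers = @{ "X-API-Key" = $apikey }
}
Invoke-RestMethod @params | Format-List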
In my Azure Tenant, I have a VM, a domain controller that hosts, well… my domain.
I only use it for testing; most recently I was doing some SSPR testing. It only gets turned on occasionally for testing some PowerShell scripts, this password reset utility, and other things that only an on-premises Domain Controller can really do.
For about two weeks I didn’t need it, so the server sat in a powered-off state. When I did need it again and powered it on, I realized I couldn’t log in with my Domain Admin credentials. The error was that my password had expired and needed to be reset.
Okay, I’ll use my backup Domain Admin account to reset it. The problem was, the backup Domain Admin account was giving the same error.
Uh-oh.
My primary and backup Domain Admin accounts, on my one cloud domain controller that isn’t replicated anywhere, are both locked out. Now what?
As luck would have it, there’s a way to fix this that’s fairly painless and actually quite simple.
1. Create a .ps1 file. The only contents it needs are one line:
Net user AD-Admin NewP@ssword!
Name it something relevant, like “password_reset.ps1”.
This HAS to be an account that’s active in your AD, and preferably a Domain Admin account. The password can be whatever you want, as long as it fits your domain password policy.
2. Go to portal.azure.com -> Storage accounts -> any_of_your_storage_accounts -> Containers (create one if you have to) -> Upload. Upload the .ps1 file you created in step 1.
3. In portal.azure.com -> Virtual Machines -> Your_VM_DC -> Settings -> Extensions + Applications -> Add a Custom Script Extension.
4. Browse to the storage container from step 2 and point to the .ps1 file created in step 1.
5. Let the deployment run.
6. Log onto your DC VM in Azure with the credentials from step 1. Reset any or all of your domain admin passwords that need it.
7. Uninstall and delete the Custom Script Extension from step 3 for this VM. Otherwise, every time it boots it will reset the password for this one user.
Delete the .ps1 file from the storage container too!
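If you'd rather push the extension from PowerShell instead of the portal, the Az module can do steps 3 and 7 as well. A rough sketch, where the resource group, VM name, location, and blob URL are all placeholders for your own environment:
#Requires the Az module, already signed in with Connect-AzAccount
#MyRG, MyDC-VM, canadacentral, and the blob URL below are placeholders
Set-AzVMCustomScriptExtension -ResourceGroupName "MyRG" -VMName "MyDC-VM" `
    -Location "canadacentral" -Name "password-reset" `
    -FileUri "https://mystorageacct.blob.core.windows.net/scripts/password_reset.ps1" `
    -Run "password_reset.ps1"
#Once you're back in (step 7), remove the extension so it doesn't re-run at every boot
Remove-AzVMCustomScriptExtension -ResourceGroupName "MyRG" -VMName "MyDC-VM" `
    -Name "password-reset" -Force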
The last firmware released for the DNS323 was back in 2013. That was quite a while ago, and it wasn’t great: it lacked SMB2 and SSH out of the box, and popular applications saw no further development. I tried Alt-F on a spare DNS323 as a test to see if I could get rsync up and running.
This isn’t meant to be an expansive entry on the pros and cons of this firmware. This is supposed to be a straightforward approach to configuring the DNS323 as an rsync target for backups, compatible with Synology DSM 6.3.
Let’s not kid ourselves, this device is pretty old. The last time it was sold anywhere was around 2007; as of this writing, that was 14 years ago. The processor is 500 MHz, it’s got 64 MB of RAM, and the maximum data transfer possible is 10 MBps. I do NOT recommend putting any sort of production or super-important data onto this. I’m using this because I love to tinker, and I have an over-abundance of spare hard drives. So please, as interesting as this entry is, if you want something with performance, look at a modern NAS and drives with warranty and up-to-date specifications!
Moving along…
The Coles Notes version of the Alt-F installation:
1. Download the latest Alt-F firmware
2. Log into your DNS323 and apply the Alt-F firmware
*I take no responsibility past this point. These instructions are recommendations, and should not be taken verbatim. This is not an official support channel. Take all the necessary precautions to backup your data beforehand.
3. Create a login password; this will also act as your ‘root’ password.
4. Format your disks. EXT2/EXT3/EXT4 and a few others are available.
It’s your choice whether to stick with RAID 1/0 or JBOD. I’m using older disks, and this is strictly a backup target for my purposes.
Create the Rsync User
Let’s create an rsync user first.
Setup -> Users
Note that the full name is the “Windows name”, while the nickname is the “Linux login” name. Take particular note of the Linux name; this is what the Synology needs to initiate a backup.
Create a folder and Share
Now we’ll need to create a share to mount the backup.
Setup -> Folders
Note the mounted drives. I configured mine independently.
sda2 – 500GB drive
sdb2 – 1000GB drive
I gave mine a share name of “backup_share”. Then hit ‘create’.
Once created, change permissions accordingly.
With the drive folder and permissions set, now configure the share.
Services -> Network -> smb -> Configure
Create a share based on the folder you created earlier.
As a test, make sure you can browse the share from Windows Explorer.
Use the username and password you created above. Make sure you can create files and folders. Notice you can enable SMB1 and SMB2 from this panel. I tried to disable SMB1, but that just made the share disappear from my Windows 10 Explorer. Could be a bug they’re working out.
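If you prefer a quick check from PowerShell instead of Explorer, something like this works (the hostname, share name, and user name below are placeholders; substitute the ones from your own setup):
#Map the share with the Samba user created above
net use \\DNS323-2\backup_share /user:rsyncuser
#Confirm you can read from and write to the share
Test-Path \\DNS323-2\backup_share
New-Item \\DNS323-2\backup_share\write_test.txt -ItemType File
Remove-Item \\DNS323-2\backup_share\write_test.txt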
Side Quest – SAMBA module
There’s also an ‘Advanced’ button in Samba Setup. Use the same root password to see the contents.
This panel is a bit more graphical in presentation, and the ‘view’ icon gives a good representation of the currently published shares. Spend a little time looking around; there could be some tweaks in this section you’ll find useful.
Rsync Service Setup
Let’s set up this DNS323 as the rsync target.
Services -> Network -> inetd
Hit ‘configure’ on the rsync service
Configure a new folder based on the path and user you created above.
It’s easier to use the built-in browser to get to your folder. Otherwise, if you already know the path, you can enter it here. Remember, this is Linux; all the directory slashes are ‘/’.
The module name is the viewable share name in Windows
Add your comments as necessary
Set permissions for the rsync account created above
Now, let’s validate that the folder created above (i.e. /mnt/sdb2/backup_share) exists in the rsync configuration. We’ll use an SSH client for this; a regular connection as root@DNS323 works. Go to /etc and run ‘more rsyncd.conf’.
The top line should give the location of rsyncd.secrets – a password file that only the rsync daemon (and root) should have access to.
And the bottom portion should provide the recently created directory with permissions for your rsync user.
PS C:\> ssh root@dns323-2
root@dns323-2's password:
[root@DNS323-2]# cd /etc
[root@DNS323-2]# more rsyncd.conf
#Sample contents
secrets file = /etc/rsyncd.secrets
use chroot = yes
read only = yes
dont compress = *
list = yes
...
You can tweak this to do things like ‘hosts allow’ for a certain subnet. For now, I’m just focusing on getting rsync running.
While you’re in here, have a look at your rsyncd.secrets file. Ideally, it should only contain one rsync user and password. Something like:
rsynryn:password
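For reference, the module portion of a rsyncd.conf generally looks something like this. The values here are illustrative (loosely based on the folder created above), not a dump of my actual file; the ‘hosts allow’ line is the kind of tweak mentioned earlier:
[backup_share]
    path = /mnt/sdb2/backup_share
    comment = Synology HyperBackup target
    auth users = rsyncuser
    read only = no
    hosts allow = 192.168.1.0/24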
DSM – Setup HyperBackup
Now we can create a backup job and target the DNS323 (with alt-f firmware). Create a new backup job, choose rsync as the file server type.
Settings should be similar to below.
For the backup settings, configure the server type as ‘rsync-compatible server’ and enter the pertinent details of your DNS323. It should look similar to the screenshot below. For the port, just keep the default 873. For the backup module, make sure to use the exact same “path” from the rsyncd.conf file.
i.e. path = /mnt/sdb2/backup_share
Backup module = /mnt/sdb2/backup_share
Directory = Backup_directory
And this creates a new directory of whatever name you want.
After this you should be able to select items to backup. Set your items, schedule them and make use of the rotational backups (very handy).
Be aware of the speeds: even with SMBv2 enabled, the backup jobs are still pretty slow over rsync, hovering around 1.2 MB/s. So time your backups accordingly, and be aware that DSM Hyper Backup cannot run simultaneous backups.
This is going to be a little different. As per usual, we need to follow our regular set of steps when dealing with a large amount of data that needs validation.
2. Edit the CSV with the proxy email addresses you want. The format you need is the account name (samaccountname) and the proxy address (SMTP:proxyemail@email.com), like below:
samaccountname,proxyaddresses
rick.sanchez,SMTP:rick.sanchez@newproxyaddress.com
rick.richardson,SMTP:rick.richardson@newproxyaddress.com
Codie.Youthead,SMTP:Codie.Youthead@newproxyaddress.com
You NEED that “SMTP:” portion in front, otherwise it won’t take.
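Once the CSV looks right, a minimal sketch of applying it (the file path is just an example; -Add appends the address on top of whatever is already there):
Import-Module ActiveDirectory
$ProxyList = Import-Csv "C:\Path_to_csv\proxy_addresses.csv" #Store CSV file into $ProxyList variable
foreach ($User in $ProxyList) {
#Add the proxy address from the CSV to each account
Set-ADUser -Identity $User.samaccountname -Add @{proxyAddresses = $User.proxyaddresses}
}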
A sample output below shows the results of the proxyaddress attributes. Notice how all the different proxies are put together in the same column.
This is OK, and does require some finer tweaking with a CSV editor. However, there’s got to be a way to display each proxy address independently in its own column.
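One way to get there is with calculated properties. A sketch (the column names and output path are just examples, and you’d add more columns if accounts carry more addresses):
Get-ADUser -Filter * -Properties proxyAddresses |
    Select-Object samAccountName,
        @{Name = "Proxy1"; Expression = { $_.proxyAddresses[0] }},
        @{Name = "Proxy2"; Expression = { $_.proxyAddresses[1] }} |
    Export-Csv "C:\Path_to_csv\proxy_report.csv" -NoTypeInformation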
Hostile takeover? All employees of a department being reassigned? We won’t go into how to disable a whole lot of employees just because upper management said ‘because we told you to’. Instead, we’ll go into changing departments for the entire company.
There are a few different ways to do this:
Exporting to CSV, making absolutely sure who’s in the list.
Or just changing everyone in one department, and replacing it with an entirely different department name.
Typically, you want option 1. This is fact checking, validation, all that stuff.
In that scenario you follow the same sort of methodology:
-Export all users that meet the criteria (in this case, everyone of a certain department) into a CSV file
-Take CSV file, inject into powershell and set new value
There is also a ‘once and done’ approach. Where you can simply replace the values in one string. I don’t suggest this for production environments, namely because there’s always a margin for error.
The ‘typical’ option
Export all users that fit a filter into a CSV file. In this example, I’m looking for all AD users with the department ‘Support’ and exporting them to a CSV file. Export something unique, like the samAccountName.
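A minimal sketch of that export (the output path is just an example):
Import-Module ActiveDirectory
#Grab everyone currently in the 'Support' department, keeping only the samAccountName
Get-ADUser -Filter 'Department -eq "Support"' -Properties Department |
    Select-Object samAccountName |
    Export-Csv "C:\Path_to_csv\department_users.csv" -NoTypeInformation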
With this exported file, “department_users.csv”, we use another piece of code to pick up the CSV file, run a foreach loop through each user in that file, and update their department.
Import-Module ActiveDirectory
$Set_Department = Import-Csv "C:\Path_to_csv\department_users.csv" #Store CSV file into $Set_Department variable
$Department = "Operations Support" #New Department
foreach ($User in $Set_Department) {
#For each account in the CSV file $Set_Department, set the new department with Set-ADUser below
$User.sAMAccountName
Set-ADUser -Identity $User.sAMAccountName -Department $Department
Write-Output $User
}
Write-Host " Departments changed "
$total = ($Set_Department).count
$total
Write-Host "AD User Departments have been updated..."
Just make sure your final CSV file contains only the SamAccountName (like below).
sAMAccountName
Zuzana.Burris
Zandra.Caig
The ‘Once-and-Done’ Option
Again, I don’t suggest this unless you feel absolutely comfortable with the results. If, however, you’re in a hurry and need to change every matching attribute to the new value, this is the line of code for you.
Get-ADUser -Filter 'Department -like "Old Department Name"' | Set-ADUser -replace @{department="New Department Name"}
As always, you can retrofit this code to suit your needs.
You could also change other AD attributes with this sort of syntax as well, just be sure to change your code, and TEST first.
Showing the Results
Let’s see which AD Accounts by SamAccountName, Department and Title have a specific title. We’ll say anything with a title of “Support”.
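A one-liner along these lines does it (the "*Support*" filter is just the example used here):
Get-ADUser -Filter 'Title -like "*Support*"' -Properties Department, Title |
    Select-Object samAccountName, Department, Title |
    Format-Table -AutoSize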
If you’re like me, you built a new AD for testing. And if you’re also like me, you imported a whole bunch of users into your AD. Some of those users likely had passwords that didn’t quite meet the domain criteria. If that happened, that means those users are disabled.
In past posts, I wrote about moving users into a different OU. Now, we’re going to change passwords for these users so we can enable them later.
For this, you’ll need a CSV in the format below. I took some liberties and retrieved a large number of random passwords from manytools.org. There are many websites that can do this; I just like the format that manytools provided. I pasted the passwords into the second column, with the first column being the sAMAccountName of the user.
sAMAccountName,Password
Zuzana.Burris,gz9DndwkBh8s
Zandra.Caig,9eC3bcJ2SzA5
PowerShell code:
Import-Module ActiveDirectory
$Resetpassword = Import-Csv "C:\path_to_username_password_file.csv"
#Store CSV file into $Resetpassword variable
foreach ($User in $Resetpassword) {
#For each name or account in the CSV file $Resetpassword, reset the password with the Set-ADAccountPassword string below
$User.sAMAccountName
$User.Password
Set-ADAccountPassword -Identity $User.sAMAccountName -Reset -NewPassword (ConvertTo-SecureString $User.Password -AsPlainText -force)
}
Write-Host " Passwords changed "
$total = ($Resetpassword).count
$total
Write-Host "Accounts passwords have been reset..."
Showing the Results
Once we’ve set the passwords, we need a way of knowing when or if the passwords were last set for a user. Referring back to the Get-ADUser cmdlet, we can look at the PasswordLastSet property.
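For example, something like this (scoped to the whole domain here; narrow the -Filter as needed):
#PasswordLastSet is the friendly version of the pwdLastSet attribute
Get-ADUser -Filter * -Properties PasswordLastSet |
    Select-Object samAccountName, PasswordLastSet |
    Sort-Object PasswordLastSet -Descending |
    Format-Table -AutoSize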
This is part of my ‘Finding all Disabled users in AD’ post from earlier. The backstory is, I used some PowerShell to import about 1100 dummy users into a newly created AD.
Out of 1100 users, 300+ became disabled due to non-compliant passwords (too short, didn’t meet requirements). My end goal was to have all disabled users re-enabled, which meant I had to give them all proper passwords. In the meantime, I decided to create this script to move all disabled users into a separate OU.
2. Export a list of disabled users, taking the unique values (samAccountName), into CSV format (done in the last post)
3. Retrieve the list with PowerShell and move all the users in that CSV list into another AD OU container
This does of course require a list of users in CSV format; just the SamAccountName, since each one is a unique value.
like so:
SamAccountName
“Codie.Youthead”
“Bellina.Kobierski”
“Melitta.Marcum”
“Marietta.Caverhill”
Sample CSV file contents
Now the code:
import-module ActiveDirectory
#Store CSV into $Movelist variable
$MoveList = Import-Csv -Path "C:\Path_AD_users_to_move.csv"
#Specify target OU to move users in that CSV file
$TargetOU = "OU=Disabled-Users,OU=contoso,DC=contoso,DC=org"
#Import the data from the CSV file and assign it to a variable
$Imported_csv = Import-Csv -Path "C:\Path_AD_users_to_move.csv"
$Imported_csv | ForEach-Object {
# Retrieve the Distinguished Name of the user
$UserDN = (Get-ADUser -Identity $_.SamAccountName).distinguishedName
Write-Host " Moving Accounts ..... "
# Move user to target OU.
Move-ADObject -Identity $UserDN -TargetPath $TargetOU #-Whatif
}
Write-Host " Completed move "
$total = ($MoveList).count
$total
Write-Host "Accounts have been moved successfully..."
Showing the Results
Typically, Get-ADUser relies on the DistinguishedName property, which is quite long and not entirely human readable.
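Sample code along these lines works, but the output isn’t very pretty (the filter here is just an example):
Get-ADUser -Filter * | Select-Object Name, DistinguishedName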
The DistinguishedName property by itself is just a string, with each piece separated by a comma “,”. That means we can split it apart while still using a single line of PowerShell.
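A sketch of what that can look like (the array index picked here is illustrative; adjust it to the piece of the DN you want):
Get-ADUser -Filter * |
    Select-Object Name, @{Name = "OU"; Expression = { ($_.DistinguishedName -split ",")[1] }}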