Category: Web

Stuff about web design, development, trends, and tricks

  • parse array for valid usernames

    I was helping a colleague who gets a list of email addresses every week and needs a valid username pulled out of each one, so writing a script to parse the list for him seemed like the best idea. This is only part of the process; there are further steps to actually create the account if it doesn't already exist, and more. So this is mainly an example of text parsing to get the info you want based on a few simple rules.

    clear
    $emails = "[email protected]", "[email protected]", "[email protected]", "[email protected]", "[email protected]", "Perumel.Aranin.Senthenekrishnin"
    $usernames = @()
    
    function Get-Username {
    <#
    .SYNOPSIS
     Takes a given email address and makes a username based on specific rules.
    .DESCRIPTION
     Converts a given email address to a username.
     The username cannot be more than 20 characters long; if the first and last name combined are too long, only the first initial is used.
     If the email has a number in it, it is retained in the new username.
    .PARAMETER email
     The email to work with.
    .PARAMETER userName
     If the email is not parsable, a username can be given to use instead (or in addition to).
    .EXAMPLE
     Get-Username -email "[email protected]" -userName "jchinkes"
    .NOTES
    #>
        Param(
            [Parameter(Mandatory=$True)][string]$email,
            [string]$userName
        )
        $fname = $null
        $lname = $null
        $nameNum = $null
        try {
            #split the email (do we need a test for valid email address?)
            $splitString1 = $email.split('@')
            #take the username part of the email and split on dot
            $splitString2 = $splitString1[0].split('.')

            if($splitString2.Count -le 1){
                #there is no separate first and last name to work with
                #fall back to the supplied username if there is one, otherwise report the problem
                if($userName){
                    return $userName
                }
                throw "'$email' could not be parsed and no -userName was supplied"
            }

            if($splitString2.Count -ge 3){
                #three or more items in the array
                if($splitString2[2] -match "^[\d\.]+$"){
                    #the third item is a number
                    $nameNum = $splitString2[2]
                    $lname = $splitString2[1]
                }
                else {
                    #third item is a name
                    $lname = $splitString2[2]
                }
            }
            else {
                #two part user name
                $lname = $splitString2[1]
            }
            $fname = $splitString2[0]

            if($nameNum -ne $null){
                #eval name to see if it needs trimming
                if ($fname.Length + $lname.Length -gt 18){
                    #trim fname and keep lname
                    $fname = $fname.Substring(0,1)
                    if ($lname.Length -gt 16){
                        #need to trim lname too!
                        $lname = $lname.Substring(0,16)
                    }
                }
                return "$fname.$lname.$nameNum"
            }
            else {
                if ($fname.Length + $lname.Length -gt 20){
                    #trim fname and keep lname
                    $fname = $fname.Substring(0,1)
                    if ($lname.Length -gt 18){
                        #need to trim lname too!
                        $lname = $lname.Substring(0,18)
                    }
                }
                return "$fname.$lname"
            }
        }
        catch {
            "There was an error- $($_.Exception.Message)"
        }
    }
    
    
    foreach($email in $emails){
     $usernames += Get-Username -email $email
    }
    
    $usernames
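
    To see the trimming rule in action, here is a hypothetical call (the name below is invented just for illustration)- the combined first and last name is longer than 20 characters, so only the first initial is kept:

    Get-Username -email "christopher.vanderbiltington@example.com"
    #returns "c.vanderbiltington" - first initial plus last name, 18 characters
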
  • Make Link List with PowerShell

    A script to get a list of links off a given page. I found these two examples to start from: http://normansolutions.co.uk/post/using-powershell-to-check-all-pages-in-website and https://www.petri.com/testing-uris-urls-powershell.

    clear
    function get-URLlist($testPage, [int]$linkDepth){
        #set the counter
        [int]$i = 0
        #loop thru the list of links, going to sub-pages until the counter equals the given link depth
        while($linkDepth -gt $i) {
            foreach($link in $testPage.Links) {
                #only look at links on the same site (relative links)
                if($link.href.ToString() -notlike "http*") {
                    #if not already in the array, add it
                    if(!($sitemap.ContainsKey($link.innerText))){
                        $sitemap.add($link.innerText, $url + "/" + $link.href)
                        #read the sub-page the link points to
                        $testPage = Invoke-WebRequest ("http://$url/" + $link.href)
                        #check out the sub-pages one less level than the given link depth
                        $sitemap = get-URLlist -testPage $testPage -linkDepth $($linkDepth-1)
                    }
                }
            }
            $i++
        }
        return $sitemap
    }

    try{
        #set your domain and the array to hold the links
        $url = "www.domain.com"
        $sitemap = @{}
        #read the page
        $testPage = Invoke-WebRequest "http://$url"
        #get all the links into the array
        $sitemap = get-URLlist -testPage $testPage -linkDepth 5
        $sitemap
    }
    catch{
        "There was an error- `r`n$($_.Exception.Message)"
    }

    I wrote this with an understanding of what the links on this particular page look like (they are relative), so the function formats them by prefixing the given URL.
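
    Since $sitemap is just a hashtable keyed by the link text, it is easy to dump or save the results once the crawl finishes. A minimal sketch (the CSV path is only an example):

    #sort the collected links by their text and export them to a CSV
    $sitemap.GetEnumerator() | Sort-Object Name |
        Select-Object @{n='Text';e={$_.Name}}, @{n='Url';e={$_.Value}} |
        Export-Csv -Path "C:\temp\sitemap.csv" -NoTypeInformation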

  • Using Google Domains with a dynamic IP

    This is a script to read the current public IP and update your Google domain with the new IP-

    #get the current IP
    $url = "https://ifconfig.co/json"
    $ip = Invoke-RestMethod $url
    #send the new IP to Google
    $updateCommand = "https://username:password@domains.google.com/nic/update?hostname=dynamic.domain.com&myip=$($ip.ip)"
    Invoke-RestMethod $updateCommand

    Here is Google's documentation on how to update your DNS via the API- https://support.google.com/domains/answer/6147083. The short version is to make a domain or subdomain dynamic. When you do this, a username and password will be generated for you; use those in the script above.
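
    If you schedule this script to run regularly, it is worth only sending the update when the address has actually changed. Here is a minimal sketch of that idea, assuming a scratch file of your choosing (the path below is just an example):

    #get the current public IP
    $ip = Invoke-RestMethod "https://ifconfig.co/json"
    #compare with the last IP we pushed, stored in a scratch file
    $lastFile = "C:\temp\lastip.txt"
    $lastIp = if (Test-Path $lastFile) { Get-Content $lastFile } else { "" }
    if ($ip.ip -ne $lastIp) {
        #only call Google when the address has changed
        $updateCommand = "https://username:password@domains.google.com/nic/update?hostname=dynamic.domain.com&myip=$($ip.ip)"
        Invoke-RestMethod $updateCommand
        #remember what we sent for next time
        $ip.ip | Set-Content $lastFile
    }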