Category: Web

Stuff about web design, development, trends, and tricks

  • Migrating Unifi Controller

    I have a VM at home running Windows Server 2008. I put the Ubiquiti UniFi controller on it and stood up first one AP, then another, to provide wireless all over my house.

    For obvious reasons I need to move away from Windows 2008, and looking at the resources on my host, I thought of moving the controller to a small Linux server. It would be dedicated to the controller, not saddled with other tasks like the Windows VM was. This also lets me learn a little more about Linux, so two birds, one stone.

    First I looked at getting an appliance, but that didn't give me the learning opportunity I wanted.

    Next, I tried about five different distros, and some were extremely fast and lightweight! Bodhi Linux was suggested as small and light while still having a GUI, but it was a little too odd for me to get working. Then I tried Lubuntu, again with a lightweight footprint and a GUI; again I couldn't get all the parts working as desired. The fastest response I got was from Tiny Core Linux, but the minimal GUI and the way applications are handled threw me off.

    In the end, I just went with the server version of Ubuntu. More distros were suggested, but they all use the server install underneath; I might as well go to the source! I just had to figure out the YAML for the networking and I was off and running.
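    For reference, the YAML networking on Ubuntu Server is handled by netplan. A minimal sketch of what that file can look like — the interface name and all the addresses here are made-up placeholders, not the ones from my setup:

    ```yaml
    # /etc/netplan/01-netcfg.yaml — minimal static-IP sketch
    # "ens160" and every address below are placeholders for your own network
    network:
      version: 2
      ethernets:
        ens160:
          dhcp4: no
          addresses: [192.168.1.10/24]
          gateway4: 192.168.1.1
          nameservers:
            addresses: [192.168.1.1, 1.1.1.1]
    ```

    Apply the change with `sudo netplan apply`.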

    Unfortunately, I was still missing something to get the controller software downloaded and operational. Perhaps it was something I was mis-translating from the guides for doing this on a Raspberry Pi. I found a middle ground where a script did most of the work, but it too was missing some parts.

    Finally! I found this script (thanks Glenn!) that covered all the needed parts and matched my distro. I ran it and my new controller came right up! Now for the step of migrating the AP from one controller to the next.

    In the end, you just turn off the old controller and restore a backup of it on the new one. The new controller picks up the AP and all the other settings. Works like a dream!

    Hopefully this helps you too.

  • parse array for valid usernames

    I was helping a colleague with a list of email addresses, trying to get a valid username out of each one. He is going to get a list of emails every week, so writing a script to parse it for him seemed the best idea. This is only part of the process; there are further steps to actually create the account if it doesn't already exist, and more. So this is mainly an example of text parsing to get the info you want based on a few simple rules.

    clear
    $emails = "[email protected]", "[email protected]", "[email protected]", "[email protected]", "[email protected]", "Perumel.Aranin.Senthenekrishnin"
    $usernames = @()

    function Get-Username {
    <#
    .SYNOPSIS
     Takes a given email address and makes a username based on specific rules.
    .DESCRIPTION
     Converts a given email address to a username.
     The username cannot be more than 20 characters long; if the first and last name combined are too long, only the first initial is used.
     If the email has a number in it, it is retained in the new username.
    .PARAMETER email
     The email to work with.
    .PARAMETER userName
     If the email is not parsable, a username can be given to use instead (or in addition to).
    .EXAMPLE
     Get-Username -email "[email protected]" -userName "jchinkes"
    .NOTES
    #>
        Param(
            [Parameter(Mandatory=$True)][string]$email,
            [string]$userName
        )
        $fname = $null
        $lname = $null
        $nameNum = $null
        try {
            #split the email (do we need a test for a valid email address?)
            $splitString1 = $email.split('@')
            #take the username part of the email and split on the dot
            $splitString2 = $splitString1[0].split('.')

            if ($splitString2.Count -le 1) {
                #no first and last name to parse;
                #fall back to the supplied username if there is one
                if ($userName) { return $userName }
            }

            if ($splitString2.Count -ge 3) {
                #three or more items in the array
                if ($splitString2[2] -match "^\d+$") {
                    #the third item is a number
                    $nameNum = $splitString2[2]
                    $lname = $splitString2[1]
                }
                else {
                    #the third item is a name
                    $lname = $splitString2[2]
                }
            }
            else {
                #two-part username
                $lname = $splitString2[1]
            }
            $fname = $splitString2[0]

            if ($null -ne $nameNum) {
                #eval the name to see if it needs trimming, leaving room for the number
                if ($fname.Length + $lname.Length -gt 18) {
                    #trim fname to an initial and keep lname
                    $fname = $fname.Substring(0, 1)
                    if ($lname.Length -gt 16) {
                        #need to trim lname too!
                        $lname = $lname.Substring(0, 16)
                    }
                }
                return "$fname.$lname.$nameNum"
            }
            else {
                if ($fname.Length + $lname.Length -gt 20) {
                    #trim fname to an initial and keep lname
                    $fname = $fname.Substring(0, 1)
                    if ($lname.Length -gt 18) {
                        #need to trim lname too!
                        $lname = $lname.Substring(0, 18)
                    }
                }
                return "$fname.$lname"
            }
        }
        catch {
            "There was an error- $($_.Exception.Message)"
        }
    }


    foreach ($email in $emails) {
        $usernames += Get-Username -email $email
    }

    $usernames

  • Make Link List with PowerShell

    A script to get a list of links off a given page. I found these two examples to start from: http://normansolutions.co.uk/post/using-powershell-to-check-all-pages-in-website and https://www.petri.com/testing-uris-urls-powershell.

    clear
    function get-URLlist($testPage, [int]$linkDepth) {
        #stop recursing once the given link depth is used up
        if ($linkDepth -le 0) { return $sitemap }
        #loop thru the list of links on the page
        foreach ($link in $testPage.Links) {
            #only look at links on the same site (relative hrefs)
            if ($link.href.ToString() -notlike "http*") {
                #if not already in the table, add it
                if (!($sitemap.ContainsKey($link.innerText))) {
                    $sitemap.add($link.innerText, $url + "/" + $link.href)
                    #read the sub-page that was just added
                    $subPage = Invoke-WebRequest "http://$url/$($link.href)"
                    #check out the sub-page, one level less than the given link depth
                    $sitemap = get-URLlist -testPage $subPage -linkDepth ($linkDepth - 1)
                }
            }
        }
        return $sitemap
    }

    try {
        #set your domain and the hashtable to hold the links
        $url = "www.domain.com"
        $sitemap = @{}
        #read the page
        $testPage = Invoke-WebRequest "http://$url"
        #get all the links into the hashtable
        $sitemap = get-URLlist -testPage $testPage -linkDepth 5
        $sitemap
    }
    catch {
        "There was an error- `r`n$($_.Exception.Message)"
    }

    I wrote this with an understanding of what the links look like on this page, so the function formats them by prepending the given URL.