Thursday, July 28, 2016

Geektool for BBC News

I've been using GeekTool on and off for some time now for my desktop needs. Here is a quick script that will show the top headlines from various BBC feeds. Adjust accordingly...

#!/bin/sh

URL="http://feeds.bbci.co.uk/news/world/rss.xml"
#URL="http://feeds.bbci.co.uk/news/technology/rss.xml"
#URL="http://feeds.bbci.co.uk/news/world/us_and_canada/rss.xml"

if [ $# -eq 1 ] ; then
  headarg="-$(( $1 * 2 ))"   # two lines (title + description) per story
else
  headarg="-8"               # default: four stories
fi

/bin/date
echo ""
echo "      BBC World News"
echo "      **************"
# Pull the feed, keep the title/description lines, strip the markup,
# and print the requested number of headlines.
curl --silent "$URL" | grep -E '(title>|description>)' | \
  sed -n '4,$p' | \
  sed -e 's/<title>//' -e 's/<\/title>//' \
      -e 's/<description>/   /' -e 's/<\/description>//' \
      -e 's/<!\[CDATA\[//g' -e 's/\]\]>//g' \
      -e 's/<[^>]*>//g' | \
  head $headarg | sed G | fmt
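
Save it as something like bbcnews.sh (the file name here is just an example) and point a GeekTool shell geeklet at it. With no argument it prints four stories; an optional argument sets the number of stories:

sh bbcnews.sh        # default: four stories
sh bbcnews.sh 6      # six stories (the argument is doubled to cover the title and description lines)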

Monday, July 25, 2016

The lovely issue of rogue Crontab emails

We've all dealt with it from time to time. Just about everyone in a system administration position has logged in to work email one morning (hopefully not on a Monday) to be greeted with thousands upon thousands of nearly identical emails from an automated task or cron that has gone awry at some point during the night.

Many people already have email rules or filters in place that move this unwanted (but legitimate) spam to a special folder to be handled at another time. Some people, however, would rather know when these issues take place so they can stop them before gigabytes of email potentially cripple the mail system.

Here's how it typically happens, at least from what I've seen. Mail for the local "administrator" or "root" account on many systems is forwarded to an address that is either monitored by, or forwarded to, any number of system administrators. When a system event or error takes place, an email is generated and sent to that address, which (by design) reaches all of the people who are in charge of that system.

This works well and is typically an accepted method for recognizing issues with systems. When things go bad, however, it can be a horrible mess. What I have found over the years is that this is done on such a mass scale that many people start out simply ignoring and deleting these emails, since much of the time they are either so cryptic that they can't be understood or they come from a system the person is just not responsible for. Eventually a mail rule is set up and all of them go to the trash or an unmonitored folder to be mass-deleted at another time. So, this might solve the issue, right? Wrong. What tends to happen is that the mail system takes a huge hit, and the people without this kind of mail rule end up taking the brunt of the heat. Not to mention, the issue at hand never gets addressed.

Sometimes these messages are sent more than once a minute, such as when a crontab runs a task every minute and each run generates multiple errors. Let's say it sends 5 emails each minute to the local administrator account, and that account is forwarded to a distribution list containing 20 people. If each email is just a few hundred kilobytes, we are already up to roughly 30MB/minute (and that's being very conservative about the situation). That works out to close to 43GB/day of unwanted and unneeded email, and keep in mind that this is just from one system.
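
The back-of-the-envelope math, as a quick shell sketch (the 300KB message size is an assumption; adjust it to match what your systems actually send):

# Rough numbers from the paragraph above; 300KB per message is an assumed size.
emails_per_min=5
recipients=20
kb_per_message=300
kb_per_min=$(( emails_per_min * recipients * kb_per_message ))   # 30,000KB, roughly 30MB/minute
mb_per_day=$(( kb_per_min * 60 * 24 / 1000 ))                    # about 43,200MB, or roughly 43GB/day
echo "${mb_per_day} MB of unwanted email per day"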

So please, please: don't set up mail rules that move or even delete these automated messages. Just figure out what is going on and fix the problem. Okay, I'm off my soapbox and back to pages of emails (I didn't even touch on the issue of wasted staff time).

Cleaning remote temp files with PowerShell

For some reason, C:\Windows\Temp seems to be filling up with Windows Update files on a lot of our servers. Here is a quick Invoke-Command solution: the first line clears the folder, and the second reports the drive usage afterwards.
Invoke-Command -ComputerName $computername -ScriptBlock {Remove-Item c:\windows\temp\* -Force -Recurse}
Invoke-Command -ComputerName $computername -ScriptBlock {Get-PSDrive c}
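
A minimal sketch of how this might be wrapped to hit several servers in one pass (the server names are placeholders, and -ErrorAction SilentlyContinue is added on the assumption that some files will be locked and in use):

$servers = 'server01', 'server02', 'server03'
foreach ($computername in $servers) {
    Invoke-Command -ComputerName $computername -ScriptBlock {
        # Locked files are simply skipped rather than stopping the run.
        Remove-Item -Path C:\Windows\Temp\* -Recurse -Force -ErrorAction SilentlyContinue
        # Report drive usage afterwards so the reclaimed space is visible.
        Get-PSDrive -Name C | Select-Object -Property Used, Free
    }
}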

Wednesday, May 4, 2016

Deleting remote files via PowerShell

Here is a handy function that is useful in many scripts. It reads computer names from the pipeline and deletes a known temp file from each machine via its administrative share.

function delete-remotefile {
    # Runs once for each computer name received from the pipeline.
    PROCESS {
        $file = "\\$_\c$\temp\temp.txt"
        if (Test-Path $file)
        {
            echo "$_ File Exists"
            Remove-Item $file -Force
            echo "$_ File Deleted"
        }
    }
}
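
A quick example of how it might be used; the server names and servers.txt are just placeholders:

'server01', 'server02' | delete-remotefile
# Or pull the list of machines from a file:
Get-Content .\servers.txt | delete-remotefile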