Category: Windows

Oracle Scheduler Jobs and Email Notifications vs crontab

Question: How do I run scheduled jobs and get a notification via email?

This is something our business asks for all the time, but the answer we were given was that to do it you would have to create a cron job directly on the server via crontab.

I thought surely not. After a bit of thought and some Googling I found that this wasn’t true: you can do it all as scheduled jobs in SQL Developer.

Create Stored Procedure

  • In SQL Developer create a new Procedure
    • Right click ‘Procedures’ and select ‘New Procedure’. Give it a logical name describing what it does; spaces are not allowed, so use underscores.
  • Type your SQL in place of the ‘NULL;’ line in the blank procedure template (a minimal sketch follows below)
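As a rough example (the procedure name and SQL are placeholders for whatever your job needs to do), the finished procedure ends up looking something like this:

CREATE OR REPLACE PROCEDURE archive_old_sessions AS
BEGIN
  -- Replace the template's NULL; with your own SQL, for example:
  DELETE FROM user_sessions WHERE last_access < SYSDATE - 30;
  COMMIT;
END archive_old_sessions;
/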

Create Scheduled Job

Or jump straight to creating the job: if your SQL doesn’t suit being a Procedure, you can put the PL/SQL block directly into the job.

  • Go to the Schema in your database which has the rights to run a scheduled job
  • Under there expand ‘Scheduler’ and ‘Jobs’, right click on ‘Jobs’ and select ‘New Job (Wizard)…’
  • Fill in the fields as shown below
    • Use logical names and give a full description of what the job is doing
    • Either put the SQL directly into ‘PL/SQL Block’ or select the Procedure you created

[Screenshot: Job Wizard]

When to execute the job?

  • Use the ‘When to Execute Job’ drop-down to select ‘Repeating’ and click the pencil icon; this will make the ‘Repeat Interval’ dialog box (shown above) appear
  • Select required time and days for the job to run, click ‘OK’.
    • I selected Daily as well as the actual days, just to be sure 🙂
  • Once happy with your choices click ‘Next >’
  • Set Destination to ‘Local’ for it to run on the server, click ‘Next >’
  • Skip over ‘Job Arguments’ to ‘Notification’
  • Now add in ‘job_succeeded’, just while you are checking that your emails work when the job runs. Add it by holding down Ctrl and clicking it, otherwise you will lose the current default selection of: job_broken, job_chain_stalled, job_failed, job_over_max_dur, job_sch_lim_reached
  • Move the ‘Date: %event_timestamp%’ line from the body to the bottom, as I’ve noticed it doesn’t add a carriage return after itself and so bunches up the rest of the notification output.
    • Also fix the ‘Error code:’ line by moving the stray % back up a line from ‘Error message:’. This might be a bug in my version of SQL Developer and may already be fixed in yours.

From this:

Retry count: %retry_count%
Error code: %error_code
%Error message: %error_message%

To this:

Retry count: %retry_count%
Error code: %error_code%
Error message: %error_message%
  • Now click ‘Next >’ through the rest of the settings and click ‘Finish’ – or see the sketch below if you would rather script the whole thing
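The wizard is essentially building a DBMS_SCHEDULER.CREATE_JOB call for you. A rough sketch of doing the same thing in PL/SQL, with the job name, procedure and schedule as placeholders:

BEGIN
  DBMS_SCHEDULER.create_job (
    job_name        => 'ARCHIVE_OLD_SESSIONS_JOB',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'ARCHIVE_OLD_SESSIONS',   -- the procedure created earlier
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYDAY=MON,TUE,WED,THU,FRI; BYHOUR=6',
    enabled         => TRUE,
    comments        => 'Example job matching the wizard settings');
END;
/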

How to Set up the email side of things

  • Edit and run the following PL/SQL using your SYSTEM account, swapping in your own SMTP server and sender address
BEGIN
  -- SMTP server (host:port) the scheduler will send notification emails through
  DBMS_SCHEDULER.set_scheduler_attribute('email_server', 'outlook.blah.co.uk:25');
  -- The 'From' address used on the notification emails
  DBMS_SCHEDULER.set_scheduler_attribute('email_sender', 'noreply@blah.co.uk');
END;
/

That should be it; all that is left to do is run your job. You can do that by right clicking the job and selecting ‘Run Job…’.
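For completeness, the Notification page of the wizard can also be set up in PL/SQL with DBMS_SCHEDULER.ADD_JOB_EMAIL_NOTIFICATION (available on recent Oracle versions). A rough sketch, with the job name and recipient address as placeholders:

BEGIN
  DBMS_SCHEDULER.add_job_email_notification (
    job_name   => 'ARCHIVE_OLD_SESSIONS_JOB',   -- placeholder job name
    recipients => 'dba.team@blah.co.uk',        -- assumed recipient address
    events     => 'job_succeeded, job_failed, job_broken');
END;
/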

Now when people start to automate jobs, they will be visible to your whole team, rather than hidden away on the server in a crontab.

How to Update an expiring / expired Certificate

This is a process which we have to do every 2 years, so I thought I’d better create a post about it, so I don’t lose it.

  • Create a CSR (Certificate Signing Request) using local or server IIS – other methods are available, e.g. OpenSSL
    • Open IIS, click ‘Server Certificates’, click ‘Create Certificate Request…’
    • Upload that CSR to your chosen certificate provider and purchase the certificate
  • Once the request is approved, download the certificate bundle
  • Extract bundle
  • Complete Certificate in IIS where you created the CSR
    • Open IIS, click ‘Server Certificates’, click ‘Complete Certificate Request…’
  • Export Certificate as PFX
    • Open IIS, click ‘Server Certificates’, Right click certificate and Export
  • Import the PFX to the required servers (or script it – see the PowerShell sketch after this list)
    • Run certlm.msc, expand Personal > Certificates, right click and Import, point at the PFX
  • Delete the old expiring certificate
    • Run certlm.msc, expand Personal > Certificates, find the old certificate by its expiry date and delete it.
  • Change IIS site binding for 443 or required secure port to point at the new certificate
    • Open IIS, browse the tree to the site, click ‘Bindings…’, double click the SSL port and change the certificate in the ‘SSL certificate:’ drop-down box.
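If you would rather script the export/import steps than click through certlm.msc on every server, a rough PowerShell sketch (thumbprint, paths and password are placeholders) looks like this:

# Export the new certificate from the machine where the CSR was completed
$pfxPassword = ConvertTo-SecureString 'ChangeMe!' -AsPlainText -Force   # placeholder password
Get-Item 'Cert:\LocalMachine\My\<thumbprint of the new certificate>' |
    Export-PfxCertificate -FilePath C:\Temp\newcert.pfx -Password $pfxPassword

# Import the PFX into the Personal store on each target server
Import-PfxCertificate -FilePath C:\Temp\newcert.pfx `
    -CertStoreLocation Cert:\LocalMachine\My -Password $pfxPassword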

Test the new certificate is working

  • Browse to the site in Chrome, click the Secure / padlock area, click ‘Certificate (Valid)’ and check the Valid From / To dates.
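You can also check which certificate the live site is actually serving from Windows PowerShell; a quick sketch (the URL is a placeholder):

# Request the site and print the expiry date of the certificate it presented
$request = [Net.HttpWebRequest]::Create('https://www.blah.co.uk')
$request.GetResponse().Dispose()
$request.ServicePoint.Certificate.GetExpirationDateString()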

Moodle alerts using New Relic Query Language NRQL

I’ve been using New Relic to monitor our Moodle installation for the past 4 years. Recently New Relic have added some lovely alerting features which use their NRQL query language to trigger alerts on application issues.

With my recent change of job role (Technology Enhanced Learning Manager) I have been using New Relic on a daily basis, trying to get the best out of it.

Whilst reading through the help on NRQL alerting available on the New Relic site, blog and forums, I hit upon a few issues.

What if you know your site performs early-morning syncs with other systems, and those syncs will always trigger your alerts? That was something I didn’t want happening, so how do you stop it?

This is how I managed it:

NRQL Database spike alert : ignores 2 to 4:59 am

SELECT average(databaseDuration) * 100 FROM Transaction WHERE hourOf(timestamp) IN ('0:00','1:00','5:00','6:00','7:00','8:00','9:00','10:00','11:00','12:00','13:00','14:00','15:00','16:00','17:00','18:00','19:00','20:00','21:00','22:00','23:00') with TIMEZONE 'Europe/London'

The above checks average database duration on the Transaction event only within the hours 00:00–01:59 and 05:00–23:59, skipping 02:00–04:59. The * 100 converts the query value to the same scale you see in the APM / Database section.

NRQL Errors > 20% : ignores 2 to 4:59 am

SELECT filter(count(*), WHERE `error.message` is not null)*100 / filter(count(*), WHERE duration is not null) as 'Errors' from Transaction, TransactionError WHERE hourOf(timestamp) IN ('0:00','1:00','5:00','6:00','7:00','8:00','9:00','10:00','11:00','12:00','13:00','14:00','15:00','16:00','17:00','18:00','19:00','20:00','21:00','22:00','23:00') with TIMEZONE 'Europe/London'

The above converts the query value to match the error rate you see on the APM / Overview. Again it only checks the Transaction data within the hours 00:00–01:59 and 05:00–23:59, skipping 02:00–04:59.

NRQL APDEX < 0.5 for 2 minutes

SELECT apdex(duration, 0.5) FROM Transaction WHERE hourOf(timestamp) IN ('0:00','1:00','2:00','3:00','4:00','5:00','6:00','7:00','8:00','9:00','10:00','11:00','12:00','13:00','14:00','15:00','16:00','17:00','18:00','19:00','20:00','21:00','22:00','23:00') with TIMEZONE 'Europe/London'

The above converts the query value to match the Apdex score on the APM / Overview, and checks the Transaction data across all 24 hours of the day.

I’ve got this set to only trigger if the condition lasts longer than 2 minutes, as it can fire in the early hours. When using a static threshold with a query value below 0.5 you cannot use ‘hourOf(timestamp) IN’ to skip some hours, because the skipped hours return zero, which trips the alert.

When I first tried setting up these alerts I was using ‘hourOf(timestamp) NOT IN’ rather than ‘hourOf(timestamp) IN’; it was thanks to a New Relic support person that I figured out ‘NOT IN’ wasn’t working correctly.
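This is roughly the ‘NOT IN’ condition I began with (shortened here to just the excluded hours, which are assumed); it reads more naturally, but it is the part that didn’t filter correctly for me:

SELECT average(databaseDuration) * 100 FROM Transaction WHERE hourOf(timestamp) NOT IN ('2:00','3:00','4:00') with TIMEZONE 'Europe/London'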

Hopefully you can get some use out of the above, please leave any examples in the comments below.

Automate copying IIS logs to SQL Database using Powershell, Log Parser and Task Scheduler

I decided not to go down the route of writing an SSIS package, as mentioned in my previous post.

So how else can you do what I wanted to achieve?

  • Remove all copied logs from the Database server
  • Copy yesterday’s logs from 9 web servers
  • Import the logs to a SQL Database

This is how I did it:

Step 1: Create 3 PowerShell scripts

Copy IIS files from yesterday:

# Work out yesterday's date as a yyyyMMdd string for comparing against file creation dates
$Date = Get-Date
$Date = $Date.AddDays(-1)
$Date2Str = $Date.ToString("yyyyMMdd")

$Files1 = gci "\\server1\c$\inetpub\logs\LogFiles\W3SVC2"
ForEach ($File in $Files1){
     $FileDate = $File.creationtime
     $CTDate2Str = $FileDate.ToString("yyyyMMdd")
     if ($CTDate2Str -eq $Date2Str) {Copy-Item $File.Fullname "C:\Logs\1718\server1"}
}

$Files2 = gci "\\server2\c$\inetpub\logs\LogFiles\W3SVC2"
ForEach ($File in $Files2){
     $FileDate = $File.creationtime
     $CTDate2Str = $FileDate.ToString("yyyyMMdd")
     if ($CTDate2Str -eq $Date2Str) {Copy-Item $File.Fullname "C:\Logs\1718\server2"}
}

$Files3 = gci "\\server3\c$\inetpub\logs\LogFiles\W3SVC2"
ForEach ($File in $Files3){
     $FileDate = $File.creationtime
     $CTDate2Str = $FileDate.ToString("yyyyMMdd")
     if ($CTDate2Str -eq $Date2Str) {Copy-Item $File.Fullname "C:\Logs\1718\server3"}
}
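Since the three copy blocks above only differ by server name, the same step can also be written as a loop; a sketch using the same paths as above:

# Same copy step as above, looping over the web server names
$Date2Str = (Get-Date).AddDays(-1).ToString("yyyyMMdd")
foreach ($Server in 'server1','server2','server3') {
    Get-ChildItem "\\$Server\c$\inetpub\logs\LogFiles\W3SVC2" |
        Where-Object { $_.CreationTime.ToString("yyyyMMdd") -eq $Date2Str } |
        Copy-Item -Destination "C:\Logs\1718\$Server"
}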

Remove all local IIS log files:

$ErrorActionPreference = "Stop"

Import-Module Pscx -EA 0

function RemoveLogFiles
{
    # The $httpLogPath* variables are set in Main below and are visible here via PowerShell's parent-scope lookup
    Write-Host "Removing log files..."
    Remove-Item ($httpLogPath1)
	Remove-Item ($httpLogPath2)
	Remove-Item ($httpLogPath3)
	Remove-Item ($httpLogPath4)
	Remove-Item ($httpLogPath5)
	Remove-Item ($httpLogPath6)
	Remove-Item ($httpLogPath7)
	Remove-Item ($httpLogPath8)
	Remove-Item ($httpLogPath9)
}

function Main
{	

	[string] $httpLogPath1 = "C:\Logs\1718\server1\*.log"
	[string] $httpLogPath2 = "C:\Logs\1718\server2\*.log"
	[string] $httpLogPath3 = "C:\Logs\1718\server3\*.log"

	[string] $httpLogPath4 = "C:\Logs\1718\server4\W3SVC1\*.log"
	[string] $httpLogPath5 = "C:\Logs\1718\server5\W3SVC1\*.log"
	[string] $httpLogPath6 = "C:\Logs\1718\server6\W3SVC1\*.log"

	[string] $httpLogPath7 = "C:\Logs\1718\server7\W3SVC2\*.log"
	[string] $httpLogPath8 = "C:\Logs\1718\server8\W3SVC2\*.log"
	[string] $httpLogPath9 = "C:\Logs\1718\server9\W3SVC2\*.log"

    RemoveLogFiles 

    Write-Host -Fore Green "Successfully removed log files."
}

Main

Import Website Logs to Database:

$ErrorActionPreference = "Stop"

Import-Module Pscx -EA 0

# Only needed if the copied logs arrive zipped; note it is not called by Main below
function ExtractLogFiles(
    [string] $httpLogPath)
{
    If ([string]::IsNullOrEmpty($httpLogPath) -eq $true)
    {
        Throw "The log path must be specified."
    }

    # Assumed archive subfolder for the processed zip files
    [string] $httpLogArchive = $httpLogPath + "\Archive"

    If ((Test-Path $httpLogArchive) -eq $false)
    {
        Write-Host "Creating archive folder for compressed log files..."
        New-Item -ItemType directory -Path $httpLogArchive | Out-Null
    }

    Write-Host "Extracting compressed log files..."

    Get-ChildItem $httpLogPath -Filter "*.zip" |
        ForEach-Object {
            Expand-Archive $_ -OutputPath $httpLogPath

            Move-Item $_.FullName $httpLogArchive
        }
}

function ImportLogFilesServer1(
    [string] $httpLogPath1)
{
    If ([string]::IsNullOrEmpty($httpLogPath1) -eq $true)
    {
        Throw "The log path must be specified."
    }

    [string] $logParser = "${env:ProgramFiles(x86)}" `
        + "\Log Parser 2.2\LogParser.exe "

    [string] $query = `
        "SELECT" `
            + " LogFilename" `
            + ", RowNumber" `
            + ", TO_TIMESTAMP(date, time) AS EntryTime" `
            + ", s-ip AS sIp" `
            + ", cs-method AS csMethod" `
            + ", cs-uri-stem AS csUriStem" `
            + ", cs-uri-query AS csUriQuery" `
            + ", s-port AS sPort" `
			+ ", TO_STRING(cs-username) AS csUsername" `
            + ", c-ip AS cIp" `
            + ", cs(User-Agent) AS csUserAgent" `
            + ", cs(Referer) AS csReferer" `
            + ", sc-status AS scStatus" `
            + ", sc-substatus AS scSubstatus" `
            + ", sc-win32-status AS scWin32Status" `
            + ", time-taken AS timeTaken" `
        + " INTO IisLogs" `
        + " FROM $httpLogPath1"

    # Fill in your database server and database name in the connection string below
    [string] $connectionString = "Driver={SQL Server Native Client 11.0};" `
        + "Server=;Database=;Trusted_Connection=yes;"

    [string[]] $parameters = @()

	$parameters += $query
    $parameters += "-i:W3C"
	$parameters += "-e:-1"
	#$parameters += "-recurse:-1"
    $parameters += "-o:SQL"
	$parameters += "-createTable:ON"
    $parameters += "-oConnString:$connectionString"

    Write-Debug "Parameters: $parameters"

    Write-Host "Importing log files to database..."
    & $logParser $parameters
}

function ImportLogFilesServer2(
    [string] $httpLogPath2)
{
    If ([string]::IsNullOrEmpty($httpLogPath2) -eq $true)
    {
        Throw "The log path must be specified."
    }

    [string] $logParser = "${env:ProgramFiles(x86)}" `
        + "\Log Parser 2.2\LogParser.exe "

    [string] $query = `
        "SELECT" `
            + " LogFilename" `
            + ", RowNumber" `
            + ", TO_TIMESTAMP(date, time) AS EntryTime" `
            + ", s-ip AS sIp" `
            + ", cs-method AS csMethod" `
            + ", cs-uri-stem AS csUriStem" `
            + ", cs-uri-query AS csUriQuery" `
            + ", s-port AS sPort" `
			+ ", TO_STRING(cs-username) AS csUsername" `
            + ", c-ip AS cIp" `
            + ", cs(User-Agent) AS csUserAgent" `
            + ", cs(Referer) AS csReferer" `
            + ", sc-status AS scStatus" `
            + ", sc-substatus AS scSubstatus" `
            + ", sc-win32-status AS scWin32Status" `
            + ", time-taken AS timeTaken" `
        + " INTO IisLogs" `
        + " FROM $httpLogPath2"

    [string] $connectionString = "Driver={SQL Server Native Client 11.0};" `
        + "Server=;Database=;Trusted_Connection=yes;"

    [string[]] $parameters = @()

	$parameters += $query
    $parameters += "-i:W3C"
	$parameters += "-e:-1"
	#$parameters += "-recurse:-1"
    $parameters += "-o:SQL"
	$parameters += "-createTable:ON"
    $parameters += "-oConnString:$connectionString"

    Write-Debug "Parameters: $parameters"

    Write-Host "Importing log files to database..."
    & $logParser $parameters
}

function ImportLogFilesServer3(
    [string] $httpLogPath3)
{
    If ([string]::IsNullOrEmpty($httpLogPath3) -eq $true)
    {
        Throw "The log path must be specified."
    }

    [string] $logParser = "${env:ProgramFiles(x86)}" `
        + "\Log Parser 2.2\LogParser.exe "

    [string] $query = `
        "SELECT" `
            + " LogFilename" `
            + ", RowNumber" `
            + ", TO_TIMESTAMP(date, time) AS EntryTime" `
            + ", s-ip AS sIp" `
            + ", cs-method AS csMethod" `
            + ", cs-uri-stem AS csUriStem" `
            + ", cs-uri-query AS csUriQuery" `
            + ", s-port AS sPort" `
			+ ", TO_STRING(cs-username) AS csUsername" `
            + ", c-ip AS cIp" `
            + ", cs(User-Agent) AS csUserAgent" `
            + ", cs(Referer) AS csReferer" `
            + ", sc-status AS scStatus" `
            + ", sc-substatus AS scSubstatus" `
            + ", sc-win32-status AS scWin32Status" `
            + ", time-taken AS timeTaken" `
        + " INTO IisLogs" `
        + " FROM $httpLogPath3"

    [string] $connectionString = "Driver={SQL Server Native Client 11.0};" `
        + "Server=;Database=;Trusted_Connection=yes;"

    [string[]] $parameters = @()

	$parameters += $query
    $parameters += "-i:W3C"
	$parameters += "-e:-1"
	#$parameters += "-recurse:-1"
    $parameters += "-o:SQL"
	$parameters += "-createTable:ON"
    $parameters += "-oConnString:$connectionString"

    Write-Debug "Parameters: $parameters"

    Write-Host "Importing log files to database..."
    & $logParser $parameters
}

function Main
{
	[string] $httpLogPath1 = "C:\Logs\1718\Server1\*.log"
	[string] $httpLogPath2 = "C:\Logs\1718\Server2\*.log"
	[string] $httpLogPath3 = "C:\Logs\1718\Server3\*.log"

#Repeat the above for all other servers

    ImportLogFilesServer1 $httpLogPath1
    ImportLogFilesServer2 $httpLogPath2
    ImportLogFilesServer3 $httpLogPath3	

#Repeat the above for all other servers

    Write-Host -Fore Green "Successfully imported log files."
}

Main
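The three ImportLogFilesServerN functions above are identical apart from the path they are given, so an alternative is to keep the single generic ImportLogFiles function from my earlier post (further down this page) and call it once per path from Main:

    # Alternative Main body: one generic import function called per server path
    ImportLogFiles $httpLogPath1
    ImportLogFiles $httpLogPath2
    ImportLogFiles $httpLogPath3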

Finally, you need to set up some scheduled tasks:

There are a few little bits here which wouldn’t work for me without the following settings:

In ‘Program/script:’ enter: Powershell.exe

In ‘Add arguments (optional):’ enter: -ExecutionPolicy Bypass -File "C:\powershell iis scripts\remove all iis log files.ps1"

[Screenshot: scheduled task settings]
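If you would rather script the task creation itself instead of clicking through Task Scheduler, something like this works on Server 2012 and later (task name, schedule and service account are placeholders):

# Create the scheduled task for the 'remove all iis log files' script, running daily at 06:00
$action  = New-ScheduledTaskAction -Execute 'Powershell.exe' `
    -Argument '-ExecutionPolicy Bypass -File "C:\powershell iis scripts\remove all iis log files.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName 'Remove copied IIS logs' -Action $action -Trigger $trigger `
    -User 'DOMAIN\svc-iislogs' -Password 'ChangeMe!' -RunLevel Highest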

The ‘Copy IIS files from yesterday’ script required some work with our server and storage team: the service account running it has to be allowed to run batch scripts without being logged on, which is granted in Local Group Policy (the ‘Log on as a batch job’ user right). Because we are on a domain, the policy was locked down so I couldn’t set this for my own user. More details to come on this – it’s time consuming to explain in great detail.

IIS Logs to SQL Database using Powershell and Log Parser

Our IIS logs are ridiculously large; every time I am asked to find out some stats it takes me ages copying the files to my machine and running some Log Parser SQL over them.

So I thought it was time to insert them into a SQL database. I adapted a script which I mostly found here and used the following to do the import. At first I was getting the following error, which I couldn’t figure out how to solve: SQL table column "csUsername" data type is not compatible with SELECT clause item "csUsername" (type INTEGER).

[Screenshot: PowerShell SQL error]

This was caused by some of the log files having the username as +12345678 or 12345678+, which is no longer an INTEGER and so could not be imported.

To fix this, note the TO_STRING(cs-username) line in the following PowerShell script, which forces csUsername to be a VARCHAR rather than an INTEGER.

$ErrorActionPreference = "Stop"

Import-Module Pscx -EA 0

function ImportLogFiles(
    [string] $httpLogPath)
{
    If ([string]::IsNullOrEmpty($httpLogPath) -eq $true)
    {
        Throw "The log path must be specified."
    }

    [string] $logParser = "${env:ProgramFiles(x86)}" `
        + "\Log Parser 2.2\LogParser.exe"

    [string] $query = `
        "SELECT" `
            + " LogFilename" `
            + ", RowNumber" `
            + ", TO_TIMESTAMP(date, time) AS EntryTime" `
            + ", s-ip AS sIp" `
            + ", cs-method AS csMethod" `
            + ", cs-uri-stem AS csUriStem" `
            + ", cs-uri-query AS csUriQuery" `
            + ", s-port AS sPort" `
            + ", TO_STRING(cs-username) AS csUsername" `
            + ", c-ip AS cIp" `
            + ", cs(User-Agent) AS csUserAgent" `
            + ", cs(Referer) AS csReferer" `
            + ", sc-status AS scStatus" `
            + ", sc-substatus AS scSubstatus" `
            + ", sc-win32-status AS scWin32Status" `
            + ", time-taken AS timeTaken" `
        + " INTO IisLogs" `
        + " FROM $httpLogPath"

    [string] $connectionString = "Driver={SQL Server Native Client 11.0};" `
        + "Server=<Database Server>;Database=<Database>;Trusted_Connection=yes;"

    [string[]] $parameters = @()

    $parameters += $query
    $parameters += "-i:W3C"
    $parameters += "-e:-1"
    $parameters += "-o:SQL"
    $parameters += "-createTable:ON"
    $parameters += "-oConnString:$connectionString"

    Write-Debug "Parameters: $parameters"

    Write-Host "Importing log files to database..."
    & $logParser $parameters
}

function Main
{
    $httpLogPath = "<Path to your logs>\*.log"

    ImportLogFiles $httpLogPath

    Write-Host -Fore Green "Successfully imported log files."
}

Main
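Once the data is in, the sort of stats request that used to mean copying log files around becomes a simple query against the IisLogs table the script creates. A rough example (the page path is a placeholder):

-- Requests per day and average time taken for a given page
SELECT CAST(EntryTime AS date) AS [Day],
       COUNT(*)                AS Requests,
       AVG(timeTaken)          AS AvgTimeTaken
FROM IisLogs
WHERE csUriStem = '/index.php'
GROUP BY CAST(EntryTime AS date)
ORDER BY [Day];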

Now I just need to write an SSIS package which will update the database daily with the IIS log files from 3 servers and two sites, so I’ll be back to update you on how I got on.

Update: the Automate copying IIS logs to SQL Database using Powershell, Log Parser and Task Scheduler post is now written – see it above to find out how.

Writable web directory in IIS 7.5

Recently I followed the advice on this topic from Stack Overflow, which has you add IIS AppPool\DefaultAppPool (or the app pool for your website) to the security permissions of the folder you want to write to.

This didn’t work for me.

What did work was granting the local server Users group Modify access on the folder.

[Screenshot: folder user permissions]
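If you want to script that permission change rather than use the Security tab, an icacls command like this (the folder path is a placeholder) does the same thing:

# Grant the local Users group Modify rights on the writable folder, inherited by subfolders and files
icacls "D:\inetpub\wwwroot\MySite\Uploads" /grant "BUILTIN\Users:(OI)(CI)M"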

Migrating Oracle VB.net application from Windows 2003 to 2012

Lessons learnt whilst migrating a VB.net application from a Windows 2003 server to a load-balanced Windows 2012 server stack. Unfortunately it wasn’t as easy as deploying the old code on a new version of IIS.

One of the first issues I had to solve was that System.Data.OracleClient is now deprecated, so I needed to switch to the Oracle data access client (Oracle.DataAccess.Client) in order to connect to an Oracle DB under IIS 8.5.

My web.config now looks something like this:

No longer available

The next task was to figure out where the Oracle database would randomly drop its connection. I did this by debugging the code in VS2017; I then had to open the Oracle data reader at several points where, with the old Oracle client, the connection had remained open.

A potentially dangerous Request.QueryString value was detected from the client. The application in question builds one parameter of a report URL from the ID of the selected radio button. The HTML generated for the page changed between running the code on 2003 and running it on a 2012 server, and on the new server the parameter was picking up the wrong part of the HTML, so it put **** in as a URL parameter. When ASP.NET’s request validation sees that kind of value it throws the above error.

How to fix this?

There are two ways: the easy way and the hard way.

The easy way, which is not recommended

Add ValidateRequest="false" to the @ Page directive of each ASP.NET page, as well as like this in your web.config. This switches off the request validation security feature and will leave your site open to scripting attacks:

No longer available

The hard way: figure out why the code is inserting a piece of HTML as a parameter within a URL. In my case it came down to the way the new server’s HTML dropped a space. To grab the ID of the selected radio button, my code located a set point in the markup by searching for 'value=' + ID + " ", which worked when the attribute was rendered as value=2 > but could no longer be found once it was rendered as value=2>.

But this won’t mean anything to you, as your code won’t be throwing an error because of the same reasons as above! Or at least for your sake I hope not!!