Category: Windows

Windows 10 October 2018 Update / 1809: No browser website access?

I installed the Windows 10 1809 update from here, bypassing my domain group policy, which greys out the ‘Check for updates’ button. This is slightly annoying in itself.

After the installation completed, I noticed random things happening when browsing the internet in every web browser I had installed (Chrome, Edge, Firefox): pages not displaying at all, browsers working for the first 5 minutes after a fresh reboot and then stopping, and Chrome crashing and refusing to open again that session. Odd! So I checked whether I could do a

nslookup google.com

from the command prompt.

This still worked as expected, so the network card and settings weren’t the cause of my strife.

I had already upgraded two other machines without any issues, so what was going on? The only difference between the machines was that the working ones were laptops using WiFi, while the broken one was a desktop using a wired Ethernet card.

After a day and a half of tweaking and reading the internet, the solution lay with Malwarebytes! I removed the installation and everything started working again. The installation had recently upgraded itself.

I’ve no idea why and, frankly, don’t care, as it caused me so much hassle to figure out what was going on.

I’ve still got Malwarebytes installed on the two other machines and they are still working as expected, so either there is a buggy version out there that halts internet access, or it detected malware and shut the internet down when combined with the 1809 update.

Hope this blog post helps someone else too.

Oracle Scheduler Jobs and Email Notifications vs crontab

Question: How do I run scheduled jobs and get a notification via email?

The above is something our business does all the time, but the answer had always been: to do that, you’ll have to create a cron job directly on the server via crontab.

I thought surely not. So, after thinking about it and a bit of Googling, I figured out that the above wasn’t true: it’s possible to do these as Scheduler jobs from SQL Developer.

Create Stored Procedure

  • In SQL Developer create a new Procedure
    • Right click on ‘Procedures’ and select ‘New Procedure’; give it a logical name describing what it does (spaces are not allowed, so use underscores).
  • Type your SQL over where the blank procedure template says ‘NULL;’

Create Scheduled Job

Alternatively, jump straight to creating the job: if you want to run a block of SQL which doesn’t suit being a procedure, you can put the SQL directly into the job.

  • Go to the Schema in your database which has the rights to run a scheduled job
  • Under there expand ‘Scheduler’ and ‘Jobs’, right click on ‘Jobs’ and select ‘New Job (Wizard)…’
  • Fill in the fields as below
    • Use logical names and give a full description of what the job is doing
    • Either put the SQL directly into ‘PL/SQL Block’ or select the Procedure you created

[Screenshot: Job Wizard showing the ‘Repeat Interval’ dialog]

When to execute the job?

  • Use the ‘When to Execute Job’ drop-down to select ‘Repeating’ and click the pencil icon; this will make the ‘Repeat Interval’ dialog box appear, shown above
  • Select required time and days for the job to run, click ‘OK’.
    • I selected Daily as well as the actual days, just to be sure 🙂
  • Once happy with your choices click ‘Next >’
  • Set Destination to ‘Local’ for it to run on the server, click ‘Next >’
  • Skip over ‘Job Arguments’ to ‘Notification’
  • Now add in ‘job_succeeded’, just whilst you are checking whether your emails work when the job runs. Add it by holding down Ctrl and clicking it, otherwise you will lose your current default options of: job_broken, job_chain_stalled, job_failed, job_over_max_dur, job_sch_lim_reached
  • Move Date: %event_timestamp% from the body to the bottom, as I’ve noticed that it doesn’t create a carriage return after it, so it will bunch up all the notification output.
    • Also fix the Error code: line by moving the stray % at the start of the Error message line back up to the end of the line above. This might be a bug in my version of SQL Developer and may already be fixed in yours.

From like this:

Retry count: %retry_count%
Error code: %error_code
%Error message: %error_message%

To like this:

Retry count: %retry_count%
Error code: %error_code%
Error message: %error_message%
  • Now click ‘Next >’ through the rest of the settings and click ‘Finish’
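For reference, the wizard is just building a DBMS_SCHEDULER call behind the scenes. Here’s a minimal hedged sketch of the equivalent PL/SQL; the schema, job and procedure names are invented for illustration:

BEGIN
  -- Hypothetical names: substitute your own schema, job and procedure
  DBMS_SCHEDULER.create_job (
    job_name        => 'MY_SCHEMA.NIGHTLY_REPORT_JOB',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'MY_SCHEMA.MY_REPORT_PROC',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYDAY=MON,TUE,WED,THU,FRI; BYHOUR=6',
    enabled         => TRUE,
    comments        => 'Runs the nightly report at 06:00 on weekdays');
END;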

How to Set up the email side of things

  • Edit and run the following SQL using your System account
BEGIN
  DBMS_SCHEDULER.set_scheduler_attribute('email_server', 'outlook.blah.co.uk:25');
  DBMS_SCHEDULER.set_scheduler_attribute('email_sender', 'noreply@blah.co.uk');
END;

That should be it; all that is left to do is to run your job. You can do that by right-clicking the job and selecting ‘Run Job…’
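You can also wire up the email notifications in PL/SQL rather than through the wizard, once the attributes above are set. A hedged sketch, with an assumed job name and recipient:

BEGIN
  -- Assumed job name and recipient; the events list mirrors the wizard's
  -- defaults, plus job_succeeded whilst testing
  DBMS_SCHEDULER.add_job_email_notification (
    job_name   => 'MY_SCHEMA.NIGHTLY_REPORT_JOB',
    recipients => 'me@blah.co.uk',
    events     => 'job_succeeded, job_failed, job_broken');
END;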

Now when people start to automate jobs, they will be visible to your whole team, rather than hidden away on the server in a crontab.

How to Update an expiring / expired Certificate

This is a process we have to do every two years, so I thought I’d better create a post about it so I don’t lose it.

  • Create a CSR (Certificate Signing Request) using local or server IIS – other methods are available, e.g. OpenSSL
    • Open IIS, click ‘Server Certificates’, click ‘Create Certificate Request…’
    • Upload that CSR to your chosen certificate provider and purchase the certificate
  • Once the request is approved, download the certificate bundle
  • Extract bundle
  • Complete the certificate request in IIS where you created the CSR
    • Open IIS, click ‘Server Certificates’, click ‘Complete Certificate Request…’
  • Export the certificate as a PFX
    • Open IIS, click ‘Server Certificates’, right-click the certificate and Export
  • Import the PFX to the required servers
    • Run certlm.msc, Personal, Certificates, right-click and Import, point at the PFX
  • Delete the old expiring certificate
    • Run certlm.msc, Personal, Certificates, find the old certificate based on its expiry date and delete it.
  • Change the IIS site binding for 443 (or your required secure port) to point at the new certificate
    • Open IIS, browse the tree to the site, click ‘Bindings…’, double-click the SSL port and change the certificate in the ‘SSL certificate:’ drop-down box.
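Several of these steps can be scripted too; here’s a hedged PowerShell sketch of the import step, where the PFX path is an assumption:

# Hedged sketch: import the exported PFX into the Local Machine Personal store.
# The file path is hypothetical; you will be prompted for the PFX password.
$pfxPassword = Read-Host "PFX password" -AsSecureString
Import-PfxCertificate -FilePath "C:\Certs\blah.co.uk.pfx" `
    -CertStoreLocation Cert:\LocalMachine\My -Password $pfxPassword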

Test new certificate is working

  • Browse to the site in Chrome, click the Secure / padlock area, click ‘Certificate (Valid)’ and check the Valid From / To entries.
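You can also sanity-check the new certificate from PowerShell; the subject filter below is an assumption:

# List matching certificates in the Local Machine Personal store with validity dates
Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -like "*blah.co.uk*" } |
    Select-Object Subject, NotBefore, NotAfter, Thumbprint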

Moodle alerts using New Relic Query Language NRQL

I’ve been using New Relic to monitor our Moodle installation for the past 4 years. Recently New Relic has added some lovely alerting features which use their NRQL language to alert on application issues.

With my recent change of job role (Technology Enhanced Learning Manager) I have now been using NR on a daily basis, trying to get the best out of it.

Whilst reading through the help on NRQL alerting available on the NR site, the NR blog and in the NR forums, I hit upon a few issues.

What if you know your site performs early-morning syncs with other systems, and these syncs will always trigger your alerts? That was something I didn’t want happening. How can you stop it?

This is how I managed it:

NRQL Database spike alert: ignores 2:00 to 4:59 am

SELECT average(databaseDuration) * 100 FROM Transaction WHERE hourOf(timestamp) IN ('0:00','1:00','5:00','6:00','7:00','8:00','9:00','10:00','11:00','12:00','13:00','14:00','15:00','16:00','17:00','18:00','19:00','20:00','21:00','22:00','23:00') with TIMEZONE 'Europe/London'

The above checks the average database duration in Transaction only within the hours 00:00-01:59 and 05:00-23:59, skipping 02:00-04:59. The * 100 gives the query value the same representation as what you can see in the APM / Database section.

NRQL Errors > 20%: ignores 2:00 to 4:59 am

SELECT filter(count(*), WHERE `error.message` is not null)*100 / filter(count(*), WHERE duration is not null) as 'Errors' from Transaction, TransactionError WHERE hourOf(timestamp) IN ('0:00','1:00','5:00','6:00','7:00','8:00','9:00','10:00','11:00','12:00','13:00','14:00','15:00','16:00','17:00','18:00','19:00','20:00','21:00','22:00','23:00') with TIMEZONE 'Europe/London'

The above converts the query value into the same as what you see when looking at the error rate on the APM / Overview. It checks Transaction only within the hours 00:00-01:59 and 05:00-23:59, skipping 02:00-04:59.

NRQL APDEX < 0.5 for 2 minutes

SELECT apdex(duration, 0.5) FROM Transaction WHERE hourOf(timestamp) IN ('0:00','1:00','2:00','3:00','4:00','5:00','6:00','7:00','8:00','9:00','10:00','11:00','12:00','13:00','14:00','15:00','16:00','17:00','18:00','19:00','20:00','21:00','22:00','23:00') with TIMEZONE 'Europe/London'

The above converts the query value into the same as what you see when looking at the Apdex score on the APM / Overview. This checks Transaction across all 24 hours of the day.

I’ve got this set to only trigger if the breach lasts longer than 2 minutes, as it can trigger briefly in the early am. When using a static threshold type with the query value below 0.5, you cannot use ‘hourOf(timestamp) IN’ to skip some hours, as the skipped hours return zero, which throws the alert.

When I first tried setting up these alerts I was using ‘hourOf(timestamp) NOT IN’ rather than ‘hourOf(timestamp) IN’. It was thanks to an NR support person that I figured out ‘NOT IN’ wasn’t working correctly.
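For reference, the variant that wasn’t working for me looked like this (reconstructed from the working database query above):

SELECT average(databaseDuration) * 100 FROM Transaction WHERE hourOf(timestamp) NOT IN ('2:00','3:00','4:00') with TIMEZONE 'Europe/London'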

Hopefully you can get some use out of the above; please leave any examples in the comments below.

Automate copying IIS logs to SQL Database using Powershell, Log Parser and Task Scheduler

I decided not to go down the route of writing an SSIS package, as mentioned in my previous post.

So how else can you do what I wanted to achieve?

  • Remove all copied logs from the Database server
  • Copy yesterday’s logs from 9 web servers
  • Import the logs to a SQL Database

This is how I did it:

Step 1: Create 3 PowerShell scripts

Copy IIS files from yesterday:

# Work out yesterday's date as a yyyyMMdd string
$Date = (Get-Date).AddDays(-1)
$Date2Str = $Date.ToString("yyyyMMdd")

# Copy any log file created yesterday from each web server's IIS log folder
# (extend the server list for all nine servers)
$Servers = "server1", "server2", "server3"

ForEach ($Server in $Servers) {
    $Files = Get-ChildItem "\\$Server\c$\inetpub\logs\LogFiles\W3SVC2"
    ForEach ($File in $Files) {
        $CTDate2Str = $File.CreationTime.ToString("yyyyMMdd")
        if ($CTDate2Str -eq $Date2Str) {
            Copy-Item $File.FullName "C:\Logs\1718\$Server"
        }
    }
}

Remove all local IIS log files:

$ErrorActionPreference = "Stop"

Import-Module Pscx -EA 0

function RemoveLogFiles
{
    Write-Host "Removing log files..."
    Remove-Item ($httpLogPath1)
	Remove-Item ($httpLogPath2)
	Remove-Item ($httpLogPath3)
	Remove-Item ($httpLogPath4)
	Remove-Item ($httpLogPath5)
	Remove-Item ($httpLogPath6)
	Remove-Item ($httpLogPath7)
	Remove-Item ($httpLogPath8)
	Remove-Item ($httpLogPath9)
}

function Main
{	

	[string] $httpLogPath1 = "C:\Logs\1718\server1\*.log"
	[string] $httpLogPath2 = "C:\Logs\1718\server2\*.log"
	[string] $httpLogPath3 = "C:\Logs\1718\server3\*.log"

	[string] $httpLogPath4 = "C:\Logs\1718\server4\W3SVC1\*.log"
	[string] $httpLogPath5 = "C:\Logs\1718\server5\W3SVC1\*.log"
	[string] $httpLogPath6 = "C:\Logs\1718\server6\W3SVC1\*.log"

	[string] $httpLogPath7 = "C:\Logs\1718\server7\W3SVC2\*.log"
	[string] $httpLogPath8 = "C:\Logs\1718\server8\W3SVC2\*.log"
	[string] $httpLogPath9 = "C:\Logs\1718\server9\W3SVC2\*.log"

    RemoveLogFiles 

    Write-Host -Fore Green "Successfully removed log files."
}

Main

Import Website Logs to Database:

$ErrorActionPreference = "Stop"

Import-Module Pscx -EA 0

function ExtractLogFiles(
    [string] $httpLogPath)
{
    If ([string]::IsNullOrEmpty($httpLogPath) -eq $true)
    {
        Throw "The log path must be specified."
    }

    [string] $httpLogArchive = $httpLogPath + "c:\logs\1718\*.log"

    If ((Test-Path $httpLogArchive) -eq $false)
    {
        Write-Host "Creating archive folder for compressed log files..."
        New-Item -ItemType directory -Path $httpLogArchive | Out-Null
    }

    Write-Host "Extracting compressed log files..."

    Get-ChildItem $httpLogPath -Filter "*.zip" |
        ForEach-Object {
            Expand-Archive $_ -OutputPath $httpLogPath

            Move-Item $_.FullName $httpLogArchive
        }
}

function ImportLogFilesServer1(
    [string] $httpLogPath1)
{
    If ([string]::IsNullOrEmpty($httpLogPath1) -eq $true)
    {
        Throw "The log path must be specified."
    }

    [string] $logParser = "${env:ProgramFiles(x86)}" `
        + "\Log Parser 2.2\LogParser.exe "

    [string] $query = `
        [string] $query = `
        "SELECT" `
            + " LogFilename" `
            + ", RowNumber" `
            + ", TO_TIMESTAMP(date, time) AS EntryTime" `
            + ", s-ip AS sIp" `
            + ", cs-method AS csMethod" `
            + ", cs-uri-stem AS csUriStem" `
            + ", cs-uri-query AS csUriQuery" `
            + ", s-port AS sPort" `
			+ ", TO_STRING(cs-username) AS csUsername" `
            + ", c-ip AS cIp" `
            + ", cs(User-Agent) AS csUserAgent" `
            + ", cs(Referer) AS csReferer" `
            + ", sc-status AS scStatus" `
            + ", sc-substatus AS scSubstatus" `
            + ", sc-win32-status AS scWin32Status" `
            + ", time-taken AS timeTaken" `
        + " INTO IisLogs" `
        + " FROM $httpLogPath1"

    [string] $connectionString = "Driver={SQL Server Native Client 11.0};" `
        + "Server=;Database=;Trusted_Connection=yes;"

    [string[]] $parameters = @()

	$parameters += $query
    $parameters += "-i:W3C"
	$parameters += "-e:-1"
	#$parameters += "-recurse:-1"
    $parameters += "-o:SQL"
	$parameters += "-createTable:ON"
    $parameters += "-oConnString:$connectionString"

    Write-Debug "Parameters: $parameters"

    Write-Host "Importing log files to database..."
    & $logParser $parameters
}

function ImportLogFilesServer2(
    [string] $httpLogPath2)
{
    If ([string]::IsNullOrEmpty($httpLogPath2) -eq $true)
    {
        Throw "The log path must be specified."
    }

    [string] $logParser = "${env:ProgramFiles(x86)}" `
        + "\Log Parser 2.2\LogParser.exe "

    [string] $query = `
        [string] $query = `
        "SELECT" `
            + " LogFilename" `
            + ", RowNumber" `
            + ", TO_TIMESTAMP(date, time) AS EntryTime" `
            + ", s-ip AS sIp" `
            + ", cs-method AS csMethod" `
            + ", cs-uri-stem AS csUriStem" `
            + ", cs-uri-query AS csUriQuery" `
            + ", s-port AS sPort" `
			+ ", TO_STRING(cs-username) AS csUsername" `
            + ", c-ip AS cIp" `
            + ", cs(User-Agent) AS csUserAgent" `
            + ", cs(Referer) AS csReferer" `
            + ", sc-status AS scStatus" `
            + ", sc-substatus AS scSubstatus" `
            + ", sc-win32-status AS scWin32Status" `
            + ", time-taken AS timeTaken" `
        + " INTO IisLogs" `
        + " FROM $httpLogPath2"

    [string] $connectionString = "Driver={SQL Server Native Client 11.0};" `
        + "Server=;Database=;Trusted_Connection=yes;"

    [string[]] $parameters = @()

	$parameters += $query
    $parameters += "-i:W3C"
	$parameters += "-e:-1"
	#$parameters += "-recurse:-1"
    $parameters += "-o:SQL"
	$parameters += "-createTable:ON"
    $parameters += "-oConnString:$connectionString"

    Write-Debug "Parameters: $parameters"

    Write-Host "Importing log files to database..."
    & $logParser $parameters
}

function ImportLogFilesServer3(
    [string] $httpLogPath3)
{
    If ([string]::IsNullOrEmpty($httpLogPath3) -eq $true)
    {
        Throw "The log path must be specified."
    }

    [string] $logParser = "${env:ProgramFiles(x86)}" `
        + "\Log Parser 2.2\LogParser.exe "

    [string] $query = `
        [string] $query = `
        "SELECT" `
            + " LogFilename" `
            + ", RowNumber" `
            + ", TO_TIMESTAMP(date, time) AS EntryTime" `
            + ", s-ip AS sIp" `
            + ", cs-method AS csMethod" `
            + ", cs-uri-stem AS csUriStem" `
            + ", cs-uri-query AS csUriQuery" `
            + ", s-port AS sPort" `
			+ ", TO_STRING(cs-username) AS csUsername" `
            + ", c-ip AS cIp" `
            + ", cs(User-Agent) AS csUserAgent" `
            + ", cs(Referer) AS csReferer" `
            + ", sc-status AS scStatus" `
            + ", sc-substatus AS scSubstatus" `
            + ", sc-win32-status AS scWin32Status" `
            + ", time-taken AS timeTaken" `
        + " INTO IisLogs" `
        + " FROM $httpLogPath3"

    [string] $connectionString = "Driver={SQL Server Native Client 11.0};" `
        + "Server=;Database=;Trusted_Connection=yes;"

    [string[]] $parameters = @()

	$parameters += $query
    $parameters += "-i:W3C"
	$parameters += "-e:-1"
	#$parameters += "-recurse:-1"
    $parameters += "-o:SQL"
	$parameters += "-createTable:ON"
    $parameters += "-oConnString:$connectionString"

    Write-Debug "Parameters: $parameters"

    Write-Host "Importing log files to database..."
    & $logParser $parameters
}

function Main
{
	[string] $httpLogPath1 = "C:\Logs\1718\Server1\*.log"
	[string] $httpLogPath2 = "C:\Logs\1718\Server2\*.log"
	[string] $httpLogPath3 = "C:\Logs\1718\Server3\*.log"

#Repeat above for all other server

    ImportLogFilesServer1 $httpLogPath1
    ImportLogFilesServer2 $httpLogPath2
    ImportLogFilesServer3 $httpLogPath3	

#Repeat above for all other server

    Write-Host -Fore Green "Successfully imported log files."
}

Main

Finally, you need to set up some scheduled tasks.

There are a few little bits here which wouldn’t work for me without the following settings:

Powershell.exe in ‘Program/script:’

-ExecutionPolicy Bypass -file "C:\powershell iis scripts\remove all iis log files.ps1" in ‘Add arguments (optional):’

[Screenshot: Task Scheduler action settings]
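If you’d rather script the task than click through the GUI, something like this should do the same job. A hedged sketch: the task name, 6 am trigger and service account are assumptions:

# Hedged sketch of the same task via the ScheduledTasks module.
# Task name, trigger time and account are assumptions; the account needs
# the 'Log on as a batch job' right mentioned below.
$action  = New-ScheduledTaskAction -Execute "Powershell.exe" `
    -Argument '-ExecutionPolicy Bypass -file "C:\powershell iis scripts\remove all iis log files.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName "Remove copied IIS logs" `
    -Action $action -Trigger $trigger `
    -User "DOMAIN\svc_iislogs" -Password "..."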

The ‘Copy IIS files from yesterday’ script required some work with our server and storage team: a service account had to be allowed to run batch scripts without being logged on, which is set in Local Group Policy (the ‘Log on as a batch job’ user right). This was because we are on a domain and the policy was set to not allow me to grant this for my user. More details to come on this; it’s time consuming to explain in great detail.


IIS Logs to SQL Database using Powershell and Log Parser

Our IIS logs are ridiculously large; every time I am asked to find out some stats, it takes me ages to copy the files to my machine and run some Log Parser SQL over them.
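For context, the kind of ad-hoc query I mean looks something like the line below; the path is hypothetical and the TOP 10 choice is just for illustration:

LogParser.exe -i:W3C -o:DATAGRID "SELECT TOP 10 cs-uri-stem, COUNT(*) AS Hits FROM C:\TempLogs\*.log GROUP BY cs-uri-stem ORDER BY COUNT(*) DESC"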

So I thought it was time to insert them into a SQL database. I adapted a script which I mostly found here and used the following to do the import. At first I was getting the following error, which I couldn’t figure out how to solve: SQL table column "csUsername" data type is not compatible with SELECT clause item "csUsername" (type INTEGER)

[Screenshot: PowerShell window showing the Log Parser SQL error]

This was caused by some of the log files having usernames such as +12345678 or 12345678+, which are no longer INTEGERs and so could not be imported into the INTEGER column Log Parser had created.

To fix this, look at the TO_STRING(cs-username) line of the following PowerShell script, which forces csUsername to be a VARCHAR rather than an INTEGER.

$ErrorActionPreference = "Stop"

Import-Module Pscx -EA 0

function ImportLogFiles(
    [string] $httpLogPath)
{
    If ([string]::IsNullOrEmpty($httpLogPath) -eq $true)
    {
        Throw "The log path must be specified."
    }

    [string] $logParser = "${env:ProgramFiles(x86)}" `
        + "\Log Parser 2.2\LogParser.exe"

    [string] $query = `
        "SELECT" `
            + " LogFilename" `
            + ", RowNumber" `
            + ", TO_TIMESTAMP(date, time) AS EntryTime" `
            + ", s-ip AS sIp" `
            + ", cs-method AS csMethod" `
            + ", cs-uri-stem AS csUriStem" `
            + ", cs-uri-query AS csUriQuery" `
            + ", s-port AS sPort" `
            + ", TO_STRING(cs-username) AS csUsername" `
            + ", c-ip AS cIp" `
            + ", cs(User-Agent) AS csUserAgent" `
            + ", cs(Referer) AS csReferer" `
            + ", sc-status AS scStatus" `
            + ", sc-substatus AS scSubstatus" `
            + ", sc-win32-status AS scWin32Status" `
            + ", time-taken AS timeTaken" `
        + " INTO IisLogs" `
        + " FROM $httpLogPath"

    [string] $connectionString = "Driver={SQL Server Native Client 11.0};" `
        + "Server=<Database Server>;Database=<Database>;Trusted_Connection=yes;"

    [string[]] $parameters = @()

    $parameters += $query
    $parameters += "-i:W3C"
    $parameters += "-e:-1"
    $parameters += "-o:SQL"
    $parameters += "-createTable:ON"
    $parameters += "-oConnString:$connectionString"

    Write-Debug "Parameters: $parameters"

    Write-Host "Importing log files to database..."
    &amp;amp;amp; $logParser $parameters
}

function Main
{
    $httpLogPath = "&amp;amp;lt;Path to your logs&amp;amp;gt;\*.log"

    ImportLogFiles $httpLogPath

    Write-Host -Fore Green "Successfully imported log files."
}

Main

Now I just need to write an SSIS package which will update the database with the IIS log files from 3 servers and two sites (daily), so I’ll be back to update you on how I got on.

Update: the Automate copying IIS logs to SQL Database using Powershell, Log Parser and Task Scheduler post is now written; see above to find out how.

Writable web directory in IIS 7.5

Recently I followed the advice on this topic on Stack Overflow, which had me add IIS AppPool\DefaultAppPool (or the AppPool for your website) to the security permissions of the folder I wanted to write to.

This didn’t work for me.

What did work was allowing the local server’s Users group Modify access to the folder.
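If you prefer to script the change, the equivalent with icacls would look something like this; the folder path is hypothetical:

icacls "C:\inetpub\wwwroot\MySite\uploads" /grant "BUILTIN\Users:(OI)(CI)M"

(OI)(CI) makes the grant inherit to files and subfolders, and M is Modify.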

[Screenshot: folder security permissions showing Users with Modify access]