Category: Windows

AppDynamics upgrade / Glassfish Master password / Keystore issues?

We were having real problems upgrading AppDynamics from 4.2 to 4.3, then on to 4.5. All of the issues were caused by things that were not mentioned in their documentation.

Hopefully this information might prove to be useful to someone else in the same situation.

Our main issue was caused by a custom password securing our Glassfish master password, keystore and keystore keys (all of which must be the same). The installer was failing because it doesn’t ask you for your password as a parameter during installation; you need to change it back to the default of ‘changeit‘ before commencing an upgrade. When we tried to get asadmin to do this for us, we got the error:

asadmin change-master-password --savemasterpassword=true
Enter the current master password>
Enter the new master password>
Enter the new master password again>
Keystore was tampered with, or password was incorrect
Command change-master-password failed.

After many hours we worked out that you first need to browse to the folder \AppDynamics\Controller\appserver\glassfish\domains\domain1\config and run the following commands to set the passwords back to the default.

keytool -storepasswd -keystore keystore.jks
Enter keystore password:
New keystore password: changeit
Re-enter new keystore password: changeit
keytool -keypasswd -alias glassfish-instance -keystore keystore.jks
Enter keystore password: changeit
Enter key password for :
New key password for : changeit
Re-enter new key password for : changeit
keytool -keypasswd -alias reporting-instance -keystore keystore.jks
Enter keystore password: changeit
Enter key password for :
New key password for : changeit
Re-enter new key password for : changeit
keytool -keypasswd -alias s1as -keystore keystore.jks
Enter keystore password: changeit
Enter key password for :
New key password for : changeit
Re-enter new key password for : changeit
The most important step, which is missed in the documentation: if you’ve secured this file using anything but ‘changeit‘, the asadmin command above will fail every time, even halfway through an upgrade.
keytool -storepasswd -keystore cacerts.jks
Enter keystore password:
New keystore password: changeit
Re-enter new keystore password: changeit
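Before re-running the asadmin command, it’s worth confirming the resets took. Listing each keystore with the default password is a quick sanity check (my own habit, not an official AppDynamics step); both commands should accept ‘changeit‘ and show the aliases used above:

```
keytool -list -keystore keystore.jks
Enter keystore password: changeit
keytool -list -keystore cacerts.jks
Enter keystore password: changeit
```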

Other possible issues which you might encounter:

JRE Path is incorrect

Error: could not open ‘\AppDynamics\Controller\jre\lib\amd64\jvm.cfg’. This error is telling you that your JRE path is messed up and the installer cannot locate the required files.

Solution:

  • Check the location of the above file
  • Edit \AppDynamics\Controller\appserver\glassfish\config\asenv.bat
    • Make sure ‘set AS_JAVA=’ points to the root of the JRE containing the above file.
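For example, the relevant line in asenv.bat should look something like this (the path below is an assumption; point it at whichever JRE root actually contains lib\amd64\jvm.cfg on your server):

```
set AS_JAVA=C:\AppDynamics\Controller\jre
```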

Error:

Stage [Discover Controller SSL certificate] failed due to
[Task failed: Discovering SSL certificate
on host: HOST
as user: HOST$
with message: Keystore was tampered with, or password was incorrect]

Solution: When upgrading the controller to 4.5, you can set your custom password for the Glassfish master password / keystore / keys in controller.groovy

keypasswd = "changeit"
storepasswd = "changeit"

But this still failed until we had set all our passwords back to ‘changeit‘ (see above for how to do this), and then set the values in controller.groovy to ‘changeit‘ as well.

Error:

Task failed: Starting Reporting Service
on host: HOST
as user: HOST$
with message: Expected is [RUNNING], but actual result is [STOPPED].

Solution:

Stop the AppDynamics Reporting Service and restart it, then continue the installation back in the AppDynamics 4.5 Enterprise Console.

Good luck out there, upgrading AppDynamics is not an easy task!

We are now looking to move to SaaS because it caused us so many issues.

Windows 10 Update October 2018 / 1809 : No browser website access?

I installed the Windows 10 1809 update from here, bypassing my domain group policy which has greyed out the ‘Check for updates’ button. That is slightly annoying in itself.

After the installation completed, I noticed random things happening when browsing the internet, on all the web browsers I had installed (Chrome, Edge, Firefox): from no pages being displayed, to browsers working for the first 5 minutes after a fresh reboot, to Chrome crashing and refusing to open again that session. Odd! So I checked whether I could do a

nslookup google.com

From the command prompt.

This was still working as expected. So the network card and settings weren’t the cause of my strife.

I had already upgraded two other machines without any issues, so what was going on? The only difference between the machines was that the working ones were laptops using WiFi; the broken one was a desktop using a wired ethernet card.

After a day and a half of tweaking and reading the internet, the solution lay with Malwarebytes! I removed the installation and everything started working again. The installation had recently upgraded itself.

I’ve no idea why and frankly don’t care as it caused me so much hassle to figure out what was going on.

I’ve still got Malwarebytes installed on the two other machines and they are still working as expected, so there must be a buggy version out there that halts internet access, or it detected malware and shut the internet down when using the 1809 update.

Hope this blog post helps someone else too.

Oracle Scheduler Jobs and Email Notifications vs crontab

Question: How do I run scheduled jobs and get a notification via email?

The above is something that our business does all the time, but the answer given was that you’d have to create a crontab job directly on the server.

I thought surely not. With a bit of Googling I figured out that the above wasn’t true; it’s possible to do these as scheduled jobs in SQL Developer.

Create Stored Procedure

  • In SQL Developer create a new Procedure
    • Right click on ‘Procedures’ and select ‘New Procedure’, give it a logical name describing what it does, spaces are not allowed, use underscores.
  • Type your SQL in place of the ‘NULL;’ in the blank procedure template

Create Scheduled Job

Or jump straight to creating the job: if you want to run a block of SQL which doesn’t suit being a procedure, you can put the SQL directly into the job.

  • Go to the Schema in your database which has the rights to run a scheduled job
  • Under there expand ‘Scheduler’ and ‘Jobs’, right click on ‘Jobs’ and select ‘New Job (Wizard)…’
  • Fill in the field like below
    • Use logical names and give a full description of what the job is doing
    • Either put the SQL in directly in ‘PL/SQL Block’ or select the Procedure you created

[Screenshot: Job Wizard]

When to execute the job?

  • Use the ‘When to Execute Job’ drop-down to select ‘Repeating’ and click the pencil; this makes the ‘Repeat Interval’ dialog box appear, shown above
  • Select required time and days for the job to run, click ‘OK’.
    • I selected Daily as well as the actual days, just to be sure 🙂
  • Once happy with your choices click ‘Next >’
  • Set Destination to ‘Local’ for it to run on the server, click ‘Next >’
  • Skip over ‘Job Arguments’ to ‘Notification’
  • Now add in ‘job_succeeded’, just whilst you are checking whether your emails work when the job runs. Add it by holding down Ctrl and clicking it, otherwise you will lose the current default options of: job_broken, job_chain_stalled, job_failed, job_over_max_dur, job_sch_lim_reached
  • Move Date: %event_timestamp% from the body to the bottom, as I’ve noticed that it doesn’t create a carriage return after it, so it bunches up all the notification output.
    • Also fix the Error code: line by moving the trailing % back up a line from Error message. This might be a bug in my version of SQL Developer and may be fixed in yours.

From like this:

Retry count: %retry_count%
Error code: %error_code
%Error message: %error_message%

To like this:

Retry count: %retry_count%
Error code: %error_code%
Error message: %error_message%
  • Now click ‘Next >’ through the rest of the settings and click ‘Finish’
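If you’d rather not click through the wizard, the same job can be created in PL/SQL. This is a sketch: the job name, procedure name, schedule and recipient are all placeholders rather than anything from the wizard steps above, and it assumes your schema has the CREATE JOB privilege:

```sql
BEGIN
  -- Create a daily job that runs the stored procedure
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'MY_DAILY_JOB',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'MY_PROCEDURE',
    repeat_interval => 'FREQ=DAILY;BYHOUR=6;BYMINUTE=0',
    enabled         => TRUE,
    comments        => 'Daily job with email notifications');

  -- Subscribe an email address to the notification events
  DBMS_SCHEDULER.ADD_JOB_EMAIL_NOTIFICATION(
    job_name   => 'MY_DAILY_JOB',
    recipients => 'someone@blah.co.uk',
    events     => 'job_succeeded, job_failed, job_broken');
END;
/
```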

How to Set up the email side of things

  • Edit and run the following SQL using your System account
BEGIN 

DBMS_SCHEDULER.set_scheduler_attribute('email_server', 'outlook.blah.co.uk:25');
DBMS_SCHEDULER.set_scheduler_attribute('email_sender', 'noreply@blah.co.uk');

END;
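To confirm the attributes took effect, you can query the scheduler’s global attributes (a DBA view, so run it with a privileged account):

```sql
SELECT attribute_name, value
FROM dba_scheduler_global_attribute
WHERE attribute_name IN ('EMAIL_SERVER', 'EMAIL_SENDER');
```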

That should be it, all that is left to do is to run your job. You can do that by right clicking the job and selecting ‘Run Job…’

Now when people start to automate jobs, they will be visible to your whole team, rather than hidden away on the server in a crontab.

How to Update an expiring / expired Certificate

This is a process which we have to do every 2 years, so I thought I’d better create a post about it, so I don’t lose it.

  • Create a CSR (Certificate Signing Request) using local or server IIS. Other methods are available, e.g. OpenSSL.
    • Open IIS, click ‘Server Certificates’, click ‘Create Certificate Request…’
    • Upload that CSR to your chosen certificate provider and purchase certificate
  • Once the request is approved, download the certificate bundle
  • Extract bundle
  • Complete Certificate in IIS where you created the CSR
    • Open IIS, click ‘Server Certificates’, click ‘Complete Certificate Request…’
  • Export Certificate as PFX
    • Open IIS, click ‘Server Certificates’, Right click certificate and Export
  • Import the PFX to the required servers
    • Run certlm.msc, Personal, Certificates, right click and Import, point at the PFX
  • Delete the old expiring certificate
    • Run certlm.msc, Personal, Certificates, find the old certificate based on its expiry date and delete it.
  • Change IIS site binding for 443 or required secure port to point at the new certificate
    • Open IIS, browse the tree to the site, click ‘Bindings…’, double click the SSL port and change the certificate in the ‘SSL certificate:’ drop-down box.

Test new certificate is working

  • Browse to the site in Chrome, click the Secure / padlock area, click ‘Certificate (Valid)’ and check the Valid From / To entries.
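You can also check from the server itself. This PowerShell one-liner (a convenience sketch, nothing site-specific assumed) lists the certificates in the Local Machine Personal store with their expiry dates:

```powershell
# List certificates in the Local Machine \ Personal store, soonest expiry first
Get-ChildItem Cert:\LocalMachine\My |
    Select-Object Subject, NotBefore, NotAfter |
    Sort-Object NotAfter
```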

Moodle alerts using New Relic Query Language NRQL

I’ve been using New Relic to monitor our Moodle installation for the past 4 years. Recently New Relic added some lovely alerting features which use their NRQL language to alert on application issues.

With my recent change of job role (Technology Enhanced Learning Manager) I have been using NR on a daily basis, trying to get the best out of it.

Whilst reading through the help on NRQL alerting available on NR site, NR blog and in the NR forums I hit upon a few issues.

What if you know your site performs early-morning syncs with other systems, and these syncs will always trigger your alerts? Something I didn’t want happening. How can you stop this?

This is how I managed it:

NRQL Database spike alert : ignores 2 to 4:59 am

SELECT average(databaseDuration) * 100 FROM Transaction WHERE hourOf(timestamp) IN ('0:00','1:00','5:00','6:00','7:00','8:00','9:00','10:00','11:00','12:00','13:00','14:00','15:00','16:00','17:00','18:00','19:00','20:00','21:00','22:00','23:00') with TIMEZONE 'Europe/London'

The above checks average database duration on Transaction events only within the hours 00:00-01:59 and 05:00-23:59, skipping 02:00-04:59. The * 100 gives the query value the same representation as what you see in the APM / Database section.

NRQL Errors > 20% : ignores 2 to 4:59 am

SELECT filter(count(*), WHERE `error.message` is not null)*100 / filter(count(*), WHERE duration is not null) as 'Errors' from Transaction, TransactionError WHERE hourOf(timestamp) IN ('0:00','1:00','5:00','6:00','7:00','8:00','9:00','10:00','11:00','12:00','13:00','14:00','15:00','16:00','17:00','18:00','19:00','20:00','21:00','22:00','23:00') with TIMEZONE 'Europe/London'

The above converts the query value into the same form as the error rate you see on APM / Overview. It checks Transaction events only within 00:00-01:59 and 05:00-23:59, skipping 02:00-04:59.

NRQL APDEX < 0.5 for 2 minutes

SELECT apdex(duration, 0.5) FROM Transaction WHERE hourOf(timestamp) IN ('0:00','1:00','2:00','3:00','4:00','5:00','6:00','7:00','8:00','9:00','10:00','11:00','12:00','13:00','14:00','15:00','16:00','17:00','18:00','19:00','20:00','21:00','22:00','23:00') with TIMEZONE 'Europe/London'

The above converts the query value into the same form as the Apdex score on APM / Overview. It checks Transaction events across all 24 hours of the day.

I’ve got this set to only trigger if the breach lasts longer than 2 minutes, as it can trigger in the early am. When using a static threshold type with the query value below 0.5, you cannot use ‘hourOf(timestamp) IN’ to skip some hours, because the skipped hours evaluate as zero, which throws the alert.

When I first tried setting up these alerts I was using ‘hourOf(timestamp) NOT IN’ rather than ‘hourOf(timestamp) IN’; it was thanks to an NR support person that I figured out ‘NOT IN’ wasn’t working correctly.

Hopefully you can get some use out of the above, please leave any examples in the comments below.

Automate copying IIS logs to SQL Database using Powershell, Log Parser and Task Scheduler

I decided not to go down the route of writing an SSIS package, as mentioned in my previous post.

So how else can you do what I wanted to achieve?

  • Remove all copied logs from the Database server
  • Copy yesterday’s logs from 9 web servers
  • Import the logs to a SQL Database

This is how I did it:

Step 1 : Create 3 Powershell scripts

Copy IIS files from yesterday:

# Yesterday's date as a yyyyMMdd string (note: four y's in the format,
# otherwise the comparison below never matches)
$Date = (Get-Date).AddDays(-1)
$Date2Str = $Date.ToString("yyyyMMdd")

# Copy yesterday's IIS logs from each web server share
$Servers = "server1", "server2", "server3"
ForEach ($Server in $Servers) {
    $Files = Get-ChildItem "\\$Server\c$\inetpub\logs\LogFiles\W3SVC2"
    ForEach ($File in $Files) {
        $CTDate2Str = $File.CreationTime.ToString("yyyyMMdd")
        if ($CTDate2Str -eq $Date2Str) {
            Copy-Item $File.FullName "C:\Logs\1718\$Server"
        }
    }
}

Remove all local iis log files:

$ErrorActionPreference = "Stop"

Import-Module Pscx -EA 0

function RemoveLogFiles([string[]] $logPaths)
{
    Write-Host "Removing log files..."
    ForEach ($path in $logPaths) {
        Remove-Item $path
    }
}

function Main
{
    # One folder per web server
    [string[]] $logPaths = @(
        "C:\Logs\1718\server1\*.log"
        "C:\Logs\1718\server2\*.log"
        "C:\Logs\1718\server3\*.log"

        "C:\Logs\1718\server4\W3SVC1\*.log"
        "C:\Logs\1718\server5\W3SVC1\*.log"
        "C:\Logs\1718\server6\W3SVC1\*.log"

        "C:\Logs\1718\server7\W3SVC2\*.log"
        "C:\Logs\1718\server8\W3SVC2\*.log"
        "C:\Logs\1718\server9\W3SVC2\*.log"
    )

    RemoveLogFiles $logPaths

    Write-Host -Fore Green "Successfully removed log files."
}

Main

Import Website Logs to Database:

$ErrorActionPreference = "Stop"

Import-Module Pscx -EA 0

function ExtractLogFiles(
    [string] $httpLogPath)
{
    If ([string]::IsNullOrEmpty($httpLogPath) -eq $true)
    {
        Throw "The log path must be specified."
    }

    [string] $httpLogArchive = Join-Path $httpLogPath "archive"

    If ((Test-Path $httpLogArchive) -eq $false)
    {
        Write-Host "Creating archive folder for compressed log files..."
        New-Item -ItemType directory -Path $httpLogArchive | Out-Null
    }

    Write-Host "Extracting compressed log files..."

    Get-ChildItem $httpLogPath -Filter "*.zip" |
        ForEach-Object {
            Expand-Archive $_ -OutputPath $httpLogPath

            Move-Item $_.FullName $httpLogArchive
        }
}

function ImportLogFiles(
    [string] $httpLogPath)
{
    If ([string]::IsNullOrEmpty($httpLogPath) -eq $true)
    {
        Throw "The log path must be specified."
    }

    [string] $logParser = "${env:ProgramFiles(x86)}" `
        + "\Log Parser 2.2\LogParser.exe"

    [string] $query = `
        "SELECT" `
            + " LogFilename" `
            + ", RowNumber" `
            + ", TO_TIMESTAMP(date, time) AS EntryTime" `
            + ", s-ip AS sIp" `
            + ", cs-method AS csMethod" `
            + ", cs-uri-stem AS csUriStem" `
            + ", cs-uri-query AS csUriQuery" `
            + ", s-port AS sPort" `
            + ", TO_STRING(cs-username) AS csUsername" `
            + ", c-ip AS cIp" `
            + ", cs(User-Agent) AS csUserAgent" `
            + ", cs(Referer) AS csReferer" `
            + ", sc-status AS scStatus" `
            + ", sc-substatus AS scSubstatus" `
            + ", sc-win32-status AS scWin32Status" `
            + ", time-taken AS timeTaken" `
        + " INTO IisLogs" `
        + " FROM $httpLogPath"

    [string] $connectionString = "Driver={SQL Server Native Client 11.0};" `
        + "Server=;Database=;Trusted_Connection=yes;"

    [string[]] $parameters = @()

    $parameters += $query
    $parameters += "-i:W3C"
    $parameters += "-e:-1"
    #$parameters += "-recurse:-1"
    $parameters += "-o:SQL"
    $parameters += "-createTable:ON"
    $parameters += "-oConnString:$connectionString"

    Write-Debug "Parameters: $parameters"

    Write-Host "Importing log files to database..."
    & $logParser $parameters
}

function Main
{
    # One folder per web server; repeat for all other servers
    [string[]] $logPaths = @(
        "C:\Logs\1718\Server1\*.log"
        "C:\Logs\1718\Server2\*.log"
        "C:\Logs\1718\Server3\*.log"
    )

    ForEach ($path in $logPaths) {
        ImportLogFiles $path
    }

    Write-Host -Fore Green "Successfully imported log files."
}

Main

Finally you need to setup some scheduled tasks:

There are a few little bits here which wouldn’t work for me without the following arguments:

Powershell.exe in ‘Program/script:’

-ExecutionPolicy Bypass -file "C:\powershell iis scripts\remove all iis log files.ps1" in ‘Add arguments (optional):’

[Screenshot: scheduled task settings]

The ‘Copy IIS files from yesterday’ script required some work with our server and storage team: the service account needed the right to run batch scripts without being logged on, which is set in the Local Group Policy. Because we are on a domain, the policy was set so that I couldn’t grant this for my user. More details to come on this; it’s time consuming to explain in great detail.
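The same tasks can be created from a command prompt instead of the Task Scheduler GUI. This schtasks sketch uses placeholders throughout (task name, 1am schedule, domain service account); adjust to suit:

```
schtasks /Create /TN "Remove all IIS log files" /SC DAILY /ST 01:00 ^
  /RU DOMAIN\ServiceAccount /RP * ^
  /TR "Powershell.exe -ExecutionPolicy Bypass -file \"C:\powershell iis scripts\remove all iis log files.ps1\""
```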


IIS Logs to SQL Database using Powershell and Log Parser

Our IIS logs are ridiculously large; every time I am asked to find some stats, it takes me ages copying the files to my machine and running some Log Parser SQL on them.

So I thought it was time to insert them into a SQL database. I adapted a script which I mostly found here and used the following to do the import. At first I was getting the following error, which I couldn’t figure out how to solve: SQL table column “csUsername” data type is not compatible with SELECT clause item “csUsername” (type INTEGER)

[Screenshot: Log Parser SQL import error]

This was being caused by some of the log files having the username as +12345678 or 12345678+, which is no longer an INTEGER and so could not be imported.

To fix this, note the TO_STRING(cs-username) line in the following Powershell script, which forces csUsername to be a VARCHAR rather than an INTEGER

$ErrorActionPreference = "Stop"

Import-Module Pscx -EA 0

function ImportLogFiles(
    [string] $httpLogPath)
{
    If ([string]::IsNullOrEmpty($httpLogPath) -eq $true)
    {
        Throw "The log path must be specified."
    }

    [string] $logParser = "${env:ProgramFiles(x86)}" `
        + "\Log Parser 2.2\LogParser.exe"

    [string] $query = `
        "SELECT" `
            + " LogFilename" `
            + ", RowNumber" `
            + ", TO_TIMESTAMP(date, time) AS EntryTime" `
            + ", s-ip AS sIp" `
            + ", cs-method AS csMethod" `
            + ", cs-uri-stem AS csUriStem" `
            + ", cs-uri-query AS csUriQuery" `
            + ", s-port AS sPort" `
            + ", TO_STRING(cs-username) AS csUsername" `
            + ", c-ip AS cIp" `
            + ", cs(User-Agent) AS csUserAgent" `
            + ", cs(Referer) AS csReferer" `
            + ", sc-status AS scStatus" `
            + ", sc-substatus AS scSubstatus" `
            + ", sc-win32-status AS scWin32Status" `
            + ", time-taken AS timeTaken" `
        + " INTO IisLogs" `
        + " FROM $httpLogPath"

    [string] $connectionString = "Driver={SQL Server Native Client 11.0};" `
        + "Server=<Database Server>;Database=<Database>;Trusted_Connection=yes;"

    [string[]] $parameters = @()

    $parameters += $query
    $parameters += "-i:W3C"
    $parameters += "-e:-1"
    $parameters += "-o:SQL"
    $parameters += "-createTable:ON"
    $parameters += "-oConnString:$connectionString"

    Write-Debug "Parameters: $parameters"

    Write-Host "Importing log files to database..."
    & $logParser $parameters
}

function Main
{
    $httpLogPath = "<Path to your logs>\*.log"

    ImportLogFiles $httpLogPath

    Write-Host -Fore Green "Successfully imported log files."
}

Main

Now I just need to write an SSIS package which will update the database with the IIS log files from 3 servers and two sites (daily), so I’ll be back to update you on how I got along.

The Automate copying IIS logs to SQL Database using Powershell, Log Parser and Task Scheduler post is now written; see it above to find out how.

Writable web directory in IIS 7.5

Recently I followed the advice on this topic on Stack Overflow, which had me add IIS AppPool\DefaultAppPool (or the AppPool for your website) to the security permissions of the folder I wanted to write to.

This didn’t work for me.

What did work was when I allowed the local server Users group the access to modify the folder.

[Screenshot: folder user permissions]

Migrating Oracle VB.net application from Windows 2003 to 2012

Lessons learnt whilst migrating a VB.net application from a Windows 2003 server to a load-balanced Windows 2012 server stack. Unfortunately it wasn’t as easy as deploying the old code on a new version of IIS.

One of the first issues I had to solve was that System.Data.OracleClient is now deprecated, so I needed to use the Oracle data access client Oracle.DataAccess.Client in order to connect to an Oracle DB from IIS 8.5

My web.config now looks something like this:

[Code sample no longer available]

The next task was to figure out where the Oracle database would randomly drop its connection. I did this by debugging the code in VS2017; I then had to open the Oracle DataReader at several points, although the connection had remained open at those same points when using the old Oracle client.

A potentially dangerous Request.QueryString value was detected from the client. The application in question was creating one parameter in a report URL, formed using the ID of the selected radio button. The HTML generated for the page changed between the code running on 2003 and running on a 2012 server; on the new server the parameter was picking up the wrong part of the HTML, so it put **** in as a URL parameter. If at any point IIS or your browser sees this type of behavior, it throws the above error.

How to fix this?

Two ways: the easy way and the hard way.

The easy way, which is not recommended

Insert ValidateRequest="false" into the page directive of each ASP page, as well as the equivalent in your web.config. This shuts off the security feature and will leave your site open to scripting attacks:

[Code sample no longer available]

The hard way: figure out why the code is inserting a piece of HTML as a parameter within a URL. Mine was doing this because of the way the new HTML output removed spaces. The code needed to grab the ID of the radio button, so it located a set point by searching for ‘value=’ + ID + ” “, which came after value=2 >. It couldn’t find that, as it was now written as value=2>.

But this won’t mean anything to you, as your code won’t be throwing an error because of the same reasons as above! Or at least for your sake I hope not!!

SSL for BI Publisher 10.1.3.4 Standalone

*** EV certificates will not work on this version of BI Publisher ***

We have an old installation of Oracle Business Intelligence 10.1.3.4 which somehow has not been kept up to date. This happens in businesses throughout the planet; the rule ‘if it works, don’t touch it’ applies.

This server was still running as HTTP, so we needed to get it secured ASAP. Nine years late is better than never.

So how do you go about it?

Create a Java Keystore

For this you need the Keytool utility which comes with the Java JDK / JRE on your server. See the notes later on, as this is a very outdated version of Keytool.

Command to create a Java Keystore:

keytool -genkey -keyalg RSA -alias mykey -keystore mykeystore.jks

Enter a keystore password and remember it; I recommend you use Password Safe to generate and store the password.

Create and Update secure-web-site.xml file

Make a copy of your existing default-web-site.xml file and name it secure-web-site.xml. Edit the new file, add secure=”true” to the end of the web-site element, and change the port: 9704 is HTTP, 9706 is the standard BI HTTPS port:

<web-site xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="http://xmlns.oracle.com/oracleas/schema/web-site-10_0.xsd" port="9706" display-name="OC4J 10g (10.1.3) Default Web Site" schema-major-version="10" schema-minor-version="0" secure="true">

Now add the line:

<ssl-config keystore="full_path_to_your_keystore.jks" keystore-password="your_keystore_password" />

before the closing </web-site> tag.

Register secure-web-site.xml file in server.xml

Towards the bottom of your server.xml file add:

<web-site default="true" path="./secure-web-site.xml" />

Remove the line referencing ./default-web-site.xml; you no longer need HTTP access once you have HTTPS enabled. To keep both the HTTP and HTTPS URLs accessible you’ll need to do something which I didn’t, because on a reboot the server automatically deletes the second web-site entry.

Add a local firewall rule for the new 9706 port

Use ‘Windows Firewall’ and look for an existing Inbound rule for 9704, copy this rule and change the port number to 9706.

Test if HTTPS URL is working

Save the changes and restart the ‘Oracle BI EE OC4J‘ service in ‘Services’ to reflect the changes, or reboot the server.

Browse to https://:9706/xmlpserver/ using the server’s own web browser or from your local machine. There could still be corporate firewalls in the way; these will not know about your new port 9706, so it will need to be enabled.

A quick test from a command prompt on your local machine:

Telnet YourServer 9706
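If the Telnet client isn’t installed (it isn’t by default on recent Windows), PowerShell can do the same port check (YourServer is a placeholder):

```powershell
Test-NetConnection YourServer -Port 9706
```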

Secure your new address with a certificate

Go back to where you created your mykeystore.jks file and delete the mykey from within it:

keytool -delete -alias mykey -keystore mykeystore.jks

Now follow the following steps:

  • Request your certificate from your certification authority
  • Complete your certificate request in IIS or elsewhere
  • Export your certificate as a .PFX file
  • Extract from the PFX your Key and Pem files
openssl pkcs12 -in yourCertificate.pfx -out yourCertificate.pem
  • Convert yourCertificate.pem to .pkcs12 file
openssl pkcs12 -export -in yourCertificate.pem -out yourCertificate.pkcs12
  • Now, very importantly, move the yourCertificate.pkcs12 file away from your Oracle BI server, as the server has a very old version of the Java Keytool which does not support the -importkeystore command; mine was running JRE 1.4.2.
  • Also move the empty mykeystore.jks file to the same directory as the yourCertificate.pkcs12
  • Check that your mykeystore.jks is empty
keytool -v -list -keystore mykeystore.jks
  • On the server with JRE / JDK 1.6 or above, I ran this using JRE 1.8, run the following command to import yourCertificate.pkcs12 to the mykeystore.jks
"C:\Program Files (x86)\Java\jre1.8.0_131\bin\keytool.exe" -v -importkeystore -srckeystore yourCertificate.pkcs12 -srcstoretype PKCS12 -destkeystore mykeystore.jks -deststoretype JKS
  • Now check your mykeystore.jks again and you should now have the imported certificate
  • Move this mykeystore.jks to the Oracle BI server

Restart ‘Oracle BI EE OC4J‘ server in ‘Services’ to reload the application with the new certificate or reboot the server.

You should now have a lovely secured server. If at first it doesn’t show, check it in a browser you haven’t used yet, or clear your cache.