Following up on the previous blog posts about how easy it is to create a REST server with kbmMW, today I want to write a little bit about logging.
kbmMW contains a quite sophisticated logging system, which lets the developer log various types of information whenever needed, and at runtime lets the administrator decide which types of log entries to react to and how.
In addition, the log can be written to a file, to the system log (OS dependent), or sent to a remote computer for storage. In fact, all of these methods can coexist at once.
What’s the purpose of logging?
Well, there can be multiple purposes, among others:
- For debugging while developing
- For debugging after deployment
- For keeping track of resources
- For keeping track of usage (perhaps even related to later invoicing)
- For documenting the causes of user complaints
- For security reasons, to track who is doing what
As you can tell, there are various logging requirements at the various stages of an application's lifetime:
- During development
- During usage
- Early warning
- Post incident investigation
A good logging system should, in my opinion, handle all of the above scenarios, while remaining simple for the developer to use and allowing the administrator to tune the amount of information logged.
kbmMW's logging system handles all of these scenarios, and the required log level can be fine-tuned late, at runtime.
In addition, the logging system should be able to output the log in relevant formats that match the application's purpose.
Web server applications might want to output some log data in a format generally accepted by web servers, and thus also by web server log file analyzer software, while other server applications may have other output requirements.
kbmMW supports several output formats, and also allows adding new ones without requiring changes to the developer's logging statements.
So let us get on with it.
First add the kbmMWLog unit to the units in which you expect to do some logging.
In our case, we have the units Unit7 (the main form unit), Unit8 (the smart service unit containing the actual REST business code) and Unit9 (a shared TContact object definition).
It makes sense to add support for logging in Unit7 and Unit8. In Unit7 it would look similar to this:
```delphi
interface

uses
  Winapi.Windows, Winapi.Messages, System.SysUtils, System.Variants,
  System.Classes, Vcl.Graphics, Vcl.Controls, Vcl.Forms, Vcl.Dialogs,
  kbmMWCustomTransport, kbmMWServer, kbmMWTCPIPIndyServerTransport,
  kbmMWRESTTransStream, kbmMWCustomConnectionPool, kbmMWCustomSQLMetaData,
  kbmMWSQLiteMetaData, kbmMWSQLite, kbmMWORM, IdBaseComponent, IdComponent,
  IdServerIOHandler, IdSSL, IdSSLOpenSSL, IdContext, kbmMWSecurity,
  kbmMWLog;
```
And in Unit8 we have also added kbmMWLog to the uses clause.
By simply adding this unit, we can already log by calling one of the methods of the publicly available default Log instance. Eg.

```delphi
Log.Debug('some debug information');
Log.Info('2 + 2 = %d',[2+2]);
```
kbmMW's logging system supports these easy-access methods:
- Debug (typically used for development purposes)
- Info (informs about some non-critical, non-error information)
- Warn (informs about some non-critical, abnormal situation)
- Error (informs about an error, such as an exception or something else that still allows the application to continue operating)
- Fatal (informs about an error of such magnitude that the application can no longer run)
- Audit (logs information that you want to use as evidence in a post-incident analysis scenario)
These in turn call a number of generic TkbmMWLog.Log methods, which take arguments for log type, severity, timestamps and much more.
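To illustrate, the level methods all accept the same Format-style arguments as Log.Info above. The variables used here are purely illustrative:

```delphi
Log.Debug('Entering CalcTotal with %d items',[ItemCount]);
Log.Info('Server started on port %d',[Port]);
Log.Warn('Disk space below %d%%',[PercentFree]);
Log.Error('Could not open database: %s',[E.Message]);
Log.Fatal('Configuration file missing. Shutting down.');
Log.Audit('User %s deleted contact %d',[UserName,ContactID]);
```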
You can ask kbmMW to log content of streams, of memory buffers, XML and JSON documents, byte arrays, and you can even ask kbmMW to produce a stack trace along with your log (not currently supported on NextGen platforms).
In our simple REST server, we might want to log whenever a user logs in or out, when a function is called, and when an exception occurs.
To intercept the login situation, we will write event handlers for the OnLoginSuccess and OnLoginFailed events on the TkbmMWAuthorizationManager instance we have on Unit7.
```delphi
procedure TForm7.kbmMWAuthorizationManager1LoginFail(Sender: TObject;
  const AActorName, ARoleName, AMessage: string);
begin
  Log.Warn('Failed login attempt as %s with role %s. %s',
    [AActorName,ARoleName,AMessage]);
end;

procedure TForm7.kbmMWAuthorizationManager1LoginSuccess(Sender: TObject;
  const AActorName, ARoleName: string;
  const AActor: TkbmMWAuthorizationActor;
  const ARole: TkbmMWAuthorizationRole);
begin
  Log.Info('Logged in as %s with role %s',[AActorName,ARoleName]);
end;
```
It makes sense to log a successful login as information, while an unsuccessful login is logged as a warning. If failed logins happen often, they could be malicious login attempts, so warnings ought to be kept an eye on.
And we might also want to log when a user logs out. The logout may happen automatically because the user has been idle for too long; refer to the previous blog post for more information.
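Assuming the authorization manager exposes a logoff event with a signature similar to the login events (the event name and parameter list below are assumptions; check TkbmMWAuthorizationManager in your kbmMW version), the handler could look like this:

```delphi
// Hypothetical logout handler; the event name and signature shown
// here are assumptions, not the confirmed kbmMW API.
procedure TForm7.kbmMWAuthorizationManager1Logoff(Sender: TObject;
  const AActorName: string);
begin
  Log.Info('Logged out as %s',[AActorName]);
end;
```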
We might also want to log what calls are made by logged in users.
This can be done in many ways and in many places. You could choose to do it within your business logic code in the smart service in Unit8, which makes sense if you want to log more specific information about the call.
But if you just want to log successful and failed calls, then it’s easy to do so using the OnServeResponse event of the TkbmMWServer instance in Unit7.
As long as the request is formatted correctly and thus served through the TkbmMWServer, the server will attempt to execute it and send a response back to the caller.
The execution may succeed or it may fail, but in all cases the OnServeResponse event will be triggered.
```delphi
procedure TForm7.kbmMWServer1ServeResponse(Sender: TObject;
  OutStream: IkbmMWCustomResponseTransportStream;
  Service: TkbmMWCustomService; ClientIdent: TkbmMWClientIdentity;
  Args: TkbmMWVariantList);
begin
  if OutStream.IsOK then
    Log.Info('Successfully called %s on service %s',
      [ClientIdent.Func,ClientIdent.ServiceName])
  else
    Log.Error('An error "%s" happened while serving request for %s on %s',
      [OutStream.StatusText,ClientIdent.Func,ClientIdent.ServiceName]);
end;
```

Note that the arguments to Log.Error are ordered to match the format string: the status text first, then the function and service names.
Now we intercept and log at strategic places in our code, and in fact the logging is already working. But the log output currently goes only to the system log, which on Windows means the debug output.
We need our log output to go to a file, preferably with nice chunking when the file reaches a certain size.
The actual output is the responsibility of the log manager. A number of log managers are included with kbmMW:
- TkbmMWStreamLogManager – Sends log to a TStream descendant.
- TkbmMWLocalFileLogManager – Sends log to a file.
- TkbmMWSystemLogManager – Sends log to system log (depends on OS).
- TkbmMWStringsLogManager – Sends log to a TStrings descendant.
- TkbmMWProxyLogManager – Proxies log to another log manager.
- TkbmMWTeeLogManager – Sends log to a number of other log managers.
- TkbmMWNullLogManager – Sends log to the bit graveyard.
If you have kbmMW Enterprise Edition and thus also have access to the WIB (Wide Information Bus) publish/subscribe transports, you have a couple of additional log managers available for remote logging:
- TkbmMWClientLogManager – Publishes logs via the WIB.
- TkbmMWServerLogManager – Subscribes for logs on the WIB, and forwards those through other log managers.
You can make your own log manager by descending from TkbmMWCustomLogManager and implementing the IkbmMWLogManager interface.
To use a log manager other than the default system log manager, you simply create an instance of the one you want and assign it to the TkbmMWLog.Log.LogManager property.
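A minimal sketch of that (the constructor argument is an assumption; check the TkbmMWLocalFileLogManager declaration in your kbmMW version for the exact signature):

```delphi
// Sketch: route all log output to a local file.
// The constructor parameter shown here is an assumption.
Log.LogManager:=TkbmMWLocalFileLogManager.Create('c:\temp\myserver.log');
```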
However, to set specific settings on the log manager, it is better to create it in a variable, set its properties, and only then assign that variable to the Log.LogManager property.
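A sketch of that approach (the file name is an assumption, and the actual rollover and backup property names should be looked up in the class declaration in kbmMWLog.pas):

```delphi
var
  mgr: TkbmMWLocalFileLogManager;
begin
  // Sketch: create the file log manager up front so its properties
  // (file naming, backup/rollover size etc.) can be adjusted before use.
  // The constructor argument is an assumption.
  mgr:=TkbmMWLocalFileLogManager.Create('c:\temp\myserver.log');
  // ... set backup naming, maximum file size etc. here ...
  Log.LogManager:=mgr;
end;
```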
An even easier way is to use one of the Log.Output….. methods, which create the relevant log managers for you, with settings that are usually good for most circumstances.
This will in fact create three log managers (a system log manager, a file log manager and a tee log manager) and automatically hook them all up.
In our case we just want to output to a file, so let us stick with the TkbmMWLocalFileLogManager. So we will simply create an instance and assign it to the Log.LogManager as shown above.
Now all log output goes to the file, and the file will automatically be backed up and a new one created when it reaches 1 MB in size. Backup naming, size etc. are all configurable on the TkbmMWLocalFileLogManager instance.
You can control which fields are output via the Log.LogManager.LogFormatter property. By default it is a TkbmMWStandardLogFormatter. kbmMW also provides a TkbmMWSimpleLogFormatter, which only outputs date/time, type and the actual log string.
The standard log formatter also outputs data type, process and thread information, and binary data (usually converted to either Base64 or pretty hexdump format).
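Switching to the simpler formatter is then a one-line change (a sketch, assuming a parameterless constructor):

```delphi
// Replace the default standard formatter with the compact one,
// so only date/time, type and the log string are output.
Log.LogManager.LogFormatter:=TkbmMWSimpleLogFormatter.Create;
```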
There is much more to logging. We didn't touch on the fact that the log system can handle separate log files for auditing and other logging, and that you can set filtering on each log manager, so that a particular log manager only logs certain log types, log levels or data types.