kbmMW Binary Parser #1

A universal binary parser made available as part of kbmMW Enterprise Edition

The next version of kbmMW Enterprise Edition will include a definition-file based (for the moment, fixed record) binary parser.

What does it do? Well... it parses well-formed binary (or textual) streams to extract telegrams and their contents. Functionality-wise it can be compared to a regular expression, just for bit- and byte-level information, although with simple scripting and calculation capabilities.

A telegram is, in this sense, a fixed-length block of bytes, which may contain bit fields, byte fields or ASCII-type string data.

The definition file defines what the telegrams look like, what subparts they consist of, and what to do when a matching part has been found.

The outcome is typically a match along with a number of keys/values, or a failure to match anything. The actual naming and use of the keys and their values are up to the developer to decide.

A definition file is by default written in YAML and consists of 3 main sections:

  • VALUES
  • TELEGRAMS
  • TAGS

The VALUES section can contain a list of predefined values to be available before any attempt to match anything. This can for example be used for defining “constants” which your application understands, or for default values.

The TELEGRAMS section contains an array of the telegram masks to look for, and the TAGS section contains an optional number of named sub parts referenced from either the TELEGRAMS section or from within the TAGS section.

It may seem a bit vague right now, but it probably makes more sense when I show a sample in a moment.

The TELEGRAMS section and the TAGS section both contain masks and optional expressions to be executed when a mask matches. They also both define whether the mask is a bytes mask, a bits mask or a string mask.

Masks which have been defined as bytes masks always operate on the byte level. Similarly, masks which have been defined as bits masks always operate on the bit level (currently a maximum of 8 bits per mask).

String masks are similar to bytes masks, except that they compare ASCII strings.

Let us look at a sample definition file. As YAML actively uses indentation to determine whether something belongs to the current definition or is a new definition, it is very important that the indentation is correct. YAML also recognizes lines starting with a dash (-) as an entry in an array, unless the dash appears to be part of another property. In fact YAML is pretty complex in what it understands, but it is easier on the human eye, which is why I chose it as the default definition file format.

The sample definition file is for a standard scale format called Toledo, devised by the company Mettler Toledo many years ago.

YAML-wise, the document actually contains an object with a single property named TOLEDO, which has 3 properties: VALUES, TELEGRAMS and TAGS.
The VALUES property has a number of properties with values. The TELEGRAMS object has one property with an array of objects, each having a mask and an optional expr property.

The TAGS property is an object with a number of properties (SWA, SWB, SWC, DP etc.), each of which is an object containing a property named bytes, bits or string, which in turn contains either a single mask with an optional expr property, or an array of such.

It may take a while to get used to reading and writing YAML documents, but perseverance makes experts.

Lines starting with # are comments.

# This is a sample file showing how to parse Toledo telegrams
# using kbmMW Binary Parser

TOLEDO:
    VALUES:
        # Unit constants
        C_UNIT_GRAM:                 2000
        C_UNIT_UK_POUND:             2001
        C_UNIT_KILOGRAM:             2002
        C_UNIT_METRIC_TON:           2003
        C_UNIT_OUNCE:                2004
        C_UNIT_TROY_OUNCE:           2005
        C_UNIT_PENNY_WEIGHT:         2006
        C_UNIT_UK_TON:               2007
        C_UNIT_CUSTOM:               2008

        # Status constants
        C_STATUS_OK:                 1000
        C_STATUS_DATA_ERROR:         1001
        C_STATUS_SCALE_ERROR:        1002
        C_STATUS_SCALE_OVERLOAD:     1003
        C_STATUS_IN_MOTION:          1004
        C_STATUS_TARE_ERROR:         1005
        C_STATUS_TRANSMISSION_ERROR: 1006
        C_STATUS_INVALID_COMMAND:    1007
        C_STATUS_INVALID_PARAMETER:  1008

        # Tare constants
        C_TARE_PRESET:               3000
        C_TARE_AUTO:                 3001
        C_TARE_NONE:                 3002

        # Default values
        STATUS:                    @C_STATUS_OK
        TARE:                      0
        GROSS:                     0
        NET:                       0
        INCREMENT_SIZE:            1
        IS_POWER_NOT_ZEROED:       false
        IS_SETTLED:                false
        IS_OVERLOAD:               false
        IS_NEGATIVE:               false
        IS_CHECKSUM_OK:            false
        WEIGHT_FACTOR:             1
        TARE_FACTOR:               1
        TARE_CODE:                 @C_TARE_NONE
        TERMINAL_NO:               0
        WEIGHT_UNIT:               @C_UNIT_KILOGRAM
        TARE_UNIT:                 @C_UNIT_KILOGRAM

    TELEGRAMS:
         bytes:
              - mask: [ 0x2, @SWA, @SWB, @SWC, 6*@W, 6*@T, 0xD, @CHK ]
                expr: - "WEIGHT_UNIT=IF(IS_UNIT_UK_POUND=1,C_UNIT_POUND,IF(IS_UNIT_KILOGRAM,C_UNIT_KILOGRAM,WEIGHT_UNIT))"
                      - "TARE_UNIT=WEIGHT_UNIT"
                      - "STATUS=IF(IS_CHECKSUM_OK=1,IF(IS_OVERLOAD,C_STATUS_OVERLOAD,C_STATUS_OK),C_STATUS_DATA_ERROR)"
                      - "WEIGHT=WEIGHT*WEIGHT_EXPANSION*IF(WEIGHT_FACTOR<1,WEIGHT_FACTOR,1)"
                      - "TARE=TARE*TARE_EXPANSION*IF(TARE_FACTOR<1,TARE_FACTOR,1)"
                      - "GROSS=IF(IS_NETTO=0,WEIGHT,0)"
                      - "NET=IF(IS_NETTO=1,WEIGHT,0)"

    TAGS:

        SWA:
         bits:
                # bit offset 0
                mask: [ 0, 0, 1, 2*@IS, 3*@DP ] 

        DP:
         bits:
        # bit offset 0, 3 bits
              - mask: [ 0, 0, 0 ]
                expr: [ WEIGHT_FACTOR=100, TARE_FACTOR=100 ]
              - mask: [ 0, 0, 1 ]
                expr: [ WEIGHT_FACTOR=10, TARE_FACTOR=10 ]
              - mask: [ 0, 1, 0 ]
                expr: [ WEIGHT_FACTOR=1, TARE_FACTOR=1 ]
              - mask: [ 0, 1, 1 ]
                expr: [ WEIGHT_FACTOR=0.1, TARE_FACTOR=0.1 ]
              - mask: [ 1, 0, 0 ]
                expr: [ WEIGHT_FACTOR=0.01, TARE_FACTOR=0.01 ]
              - mask: [ 1, 0, 1 ]
                expr: [ WEIGHT_FACTOR=0.001, TARE_FACTOR=0.001 ]
              - mask: [ 1, 1, 0 ]
                expr: [ WEIGHT_FACTOR=0.0001, TARE_FACTOR=0.0001 ]
              - mask: [ 1, 1, 1 ]
                expr: [ WEIGHT_FACTOR=0.00001, TARE_FACTOR=0.00001 ]

        IS:
         bits:
              # bit offset 3, 2 bits
              - mask: [ 0, 1 ]
                expr: INCREMENT_SIZE=1
              - mask: [ 1, 0 ]
                expr: INCREMENT_SIZE=2
              - mask: [ 1, 1 ]
                expr: INCREMENT_SIZE=5

        CHK:
         bytes:
                expr: "IS_CHECKSUM_OK=IF(CHK2COMP7(0,17)=VALUE,1,0)"

        SWB:
         bits:
                mask: [ 0, IS_POWER_NOT_ZEROED, 1, IS_UNIT_UK_POUND/IS_UNIT_KILOGRAM, !IS_SETTLED, IS_OVERLOAD, IS_NEGATIVE, IS_NETTO ]

        SWC:
         bits:
                mask: [ 0, IS_HANDTARE, 1, @EW, IS_PRINTREQUEST, 3*@WF ]

        WF:
         bits:
              - mask: [ 0, 0, 0 ]

              - mask: [ 0, 0, 1 ]
                expr: [WEIGHT_UNIT=C_UNIT_GRAM, TARE_UNIT=C_UNIT_GRAM ]
              - mask: [ 0, 1, 0 ]
                expr: [WEIGHT_UNIT=C_UNIT_METRIC_TON, TARE_UNIT=C_UNIT_METRIC_TON ]
              - mask: [ 0, 1, 1 ]
                expr: [WEIGHT_UNIT=C_UNIT_OUNCE, TARE_UNIT=C_UNIT_OUNCE ]
              - mask: [ 1, 0, 0 ]
                expr: [WEIGHT_UNIT=C_UNIT_TROY_OUNCE, TARE_UNIT=C_UNIT_TROY_OUNCE ]
              - mask: [ 1, 0, 1 ]
                expr: [WEIGHT_UNIT=C_UNIT_PENNY_WEIGHT, TARE_UNIT=C_UNIT_PENNY_WEIGHT ]
              - mask: [ 1, 1, 0 ]
                expr: [WEIGHT_UNIT=C_UNIT_UK_TON, TARE_UNIT=C_UNIT_UK_TON ]
              - mask: [ 1, 1, 1 ]
                expr: [WEIGHT_UNIT=C_UNIT_CUSTOM, TARE_UNIT=C_UNIT_CUSTOM ]

        EW:
         bits:
              - mask: 0
                expr: [ WEIGHT_EXPANSION=1, TARE_EXPANSION=1 ]
              - mask: 1
                expr: [ WEIGHT_EXPANSION=10, TARE_EXPANSION=10 ]

        W:
         string:
                expr: WEIGHT=VALUE

        T:
         string:
                expr: TARE=VALUE

When the kbmMW Binary Parser is provided with this definition file, it compiles it into a parse tree, which can efficiently parse whatever you throw at it as a file or a stream.

We can see that one telegram mask has been defined in the TELEGRAMS/bytes array. It contains a mask that consists of 8 parts. Each part is, unless a * is included, exactly 1 byte wide.

The first part is 0x2, which simply means that the data must start with the hexadecimal value 2, which is STX (start of text) in the ASCII character set.

The second part is @SWA, which means that there must be one byte, which will be parsed by the tag called SWA.

The @SWB and @SWC parts also match one byte each, which must be parsed by the correspondingly named tag.

Then we have the 6*@W part. That means that there are 6 bytes which must be parsed by the W tag.

You get the picture?

Let’s look at the SWA tag. It is defined as a bits tag. Hence it only parses bits and at most 8 of them. It has a mask defined as 0, 0, 1, 2*@IS, 3*@DP

That means that the most significant bit should be 0, the next one should also be 0, the next should be 1, then come 2 bits which should be parsed by the IS tag, and finally the 3 least significant bits should be parsed by the DP tag.

Looking at the DP tag, you will see that it is also a bits-type tag, which makes sense since we are parsing a subset of bits from the SWA tag.

A number of possible DP bit masks are defined, which, when matched, result in one or more expressions being executed.

So let’s say that the 3 bits match 1 0 1. Then the expressions WEIGHT_FACTOR=0.001 and TARE_FACTOR=0.001 are both executed, essentially setting some values we can use later on, or inspect from our program. Notice the []? In YAML that is called an inline (flow) array, where each element is separated by a comma. That is why two expressions are executed in this case, when the match is successful.

The IS tag follows a similar pattern to the DP tag.

The SWB tag is an interesting one. It is used for parsing the 3rd byte of the data. It is also a bits-type mask, and contains 8 parts, one for each bit in the matched byte.
The most significant bit should be 0. Whatever the next bit is, it is stored in the value IS_POWER_NOT_ZEROED, which can then be used in other expressions or by the developer later on. Then a 1 bit must follow.

Next comes a bit, which if set, sets IS_UNIT_UK_POUND to 1 and IS_UNIT_KILOGRAM to 0, else it sets IS_UNIT_KILOGRAM to 1 and IS_UNIT_UK_POUND to 0.

The next bit is stored negated in the value IS_SETTLED. So if the bit was 1, then IS_SETTLED is set to 0 and vice versa.

The 3 remaining bits set the IS_OVERLOAD, IS_NEGATIVE and IS_NETTO values.

Simple stuff, right? 🙂

Now let us look at the W tag. It’s defined to be a string-type tag, which means that any masks we write must be written as strings, and any value matched is seen as a string (a collection of bytes). As the W tag does not have any masks defined, the tag is always considered a match, and any optional expression on that tag is run. In this case we just set the value WEIGHT equal to the complete matched value.

That introduces the magic word VALUE. It is a special variable which always contains the latest match, regardless of whether it is a bits, bytes or string match. In this case, it is how we get the matched weight string into the WEIGHT value.

When all matches have been successful, we have a matching telegram, and only then will the matching telegram's expressions be run. Internally the kbmMW Binary Parser uses the kbmMemTable SQL and expression parser, and as such it can do all the things that expression parser can do, including calling functions etc.

We are still missing the code to run the parser.

     rd:=TkbmMWBPFileReader.Create(eDefFile.Text);
     try
        rd.OnSkipping:=procedure(var AByteCount:integer)
                       begin
                          Log.Info('Skipping '+inttostr(AByteCount));
                       end;

        // If you want to see the parsed values on a positive match.
        rd.OnMatch:=procedure(AValues:IkbmMWBPValues; var ABreak:boolean; const ASize:integer)
                    var
                        a:TArray<string>;
                       i:integer;
                       row:integer;
                    begin
                         grid.DefaultRowHeight:=25;
                         grid.RowCount:=AValues.Count+1;
                         row:=1;
                         a:=AValues.Names;
                         TArray.Sort<string>(a);
                         for i:=0 to High(a) do
                         begin
                              grid.Cells[0,row]:=a[i];
                              grid.Cells[1,row]:=VarToStr(AValues.Value[a[i]]);
                              inc(row);
                         end;
                         if AValues.Value['IS_OVERLOAD'] then
                            Log.Info('Overload')
//                         else if AValues.Value['IS_SETTLED'] then
//                              Log.Info('Unsettled gross:'+VarToStr(AValues.Value['GROSS']))
                         else
                             Log.Info('Gross:'+VarToStr(AValues.Value['GROSS'])+' Settled:'+vartostr(AValues.Value['IS_SETTLED']));
//                         ABreak:=true; // Only return first match.
                    end;

        rd.Run(eDataFile.Text);
        Log.Info('Found '+inttostr(rd.MatchCount)+' matches. Skipped '+inttostr(rd.SkippedBytes)+' bytes');
     finally
        rd.Free;
     end;

We take advantage of the file reader that is made available, which makes it easy to parse large files. But one could just as easily have created other types of readers, descending from TkbmMWBPCustomReader.

Each time the parser is not able to parse something successfully, it will attempt to skip past it, until either a match is made, or all data has been processed. The OnSkipping event is called on those occasions.

When a match is made, the OnMatch event is called. The developer can choose what to do with the values and if the parsing should continue when the event is done.

The file reader accepts one argument, the name of the definition file, and what is being read is the file whose name is given in the Run call.

Run will continue until either all data has been read, or the process is interrupted, either by setting AByteCount to zero in OnSkipping, or by setting ABreak to true in OnMatch.
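As a minimal sketch of those two early-stop options, using the same event signatures as in the sample above:

     // Stop as soon as the parser has to skip unparsable data.
     rd.OnSkipping:=procedure(var AByteCount:integer)
                    begin
                         AByteCount:=0;
                    end;

     // Stop after the first matching telegram.
     rd.OnMatch:=procedure(AValues:IkbmMWBPValues; var ABreak:boolean; const ASize:integer)
                 begin
                      ABreak:=true;
                 end;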

After the execution ends, you can check how many bytes were skipped and how many telegrams were matched in total.

The parser is likely to evolve as new requirements appear, and I encourage users of it to play an active role in extending it so we all can benefit from a very versatile binary parser.

The CHM beast – kbmMW – 24,000+ topics

The vast toolbox kbmMW contains more than 24,000 topics in the auto-generated CHM documentation.

For your browsing pleasure, I have auto-generated a Windows CHM help for kbmMW Enterprise Edition, however only with a subset of database adapter and transport adapter features enabled. Thus you will only see database support for SQLite, MD (virtual dataset) and MT (kbmMemTable).

Remember there are 33+ additional database adapters and another 5-10 transport adapters to choose between, which are not documented in this CHM, although they follow a similar structure to those that are.

In all there are more than 24,000 topics.

For the curious, there may be a couple of interesting hints in it, about what is about to come for kbmMW 🙂

Let the browsing and guessing begin!

The unsupported CHM file is here: kbmMW_CHM

 

REST easy with kbmMW #5 – Logging

Following up on the previous blog posts about how easy it is to create a REST server with kbmMW, today I want to write a little bit about logging.

kbmMW contains a quite sophisticated logging system, which lets the developer log various types of information whenever needed, and lets the administrator decide at runtime which types of log to react to and how.

In addition the log can be output to a file, to the system log (OS dependent), or sent to a remote computer for storage. In fact all the above methods can coexist at once.

What’s the purpose of logging?

Well. There can be multiple purposes, amongst others:

  • For debugging while developing
  • For debugging after deployment
  • For keeping track of resources
  • For keeping track of usage (perhaps even relating to later invoicing)
  • For documenting the causes of user complaints
  • For security reasons, to track who is doing what

As you can tell, there seems to be various log requirements for various stages of the lifetime of the application:

  • During development
  • During usage
  • Early warning
  • Post incident investigation

A good log system should, in my opinion, handle all of the above scenarios, while being simple for the developer to use, and allow the administrator to tune the amount of information logged.

kbmMW’s log system handles all these scenarios, and can be fine-tuned late (at runtime) for the required log level.

In addition, the log system should be able to output the log in relevant formats that match the application’s purpose.

Web server applications might want to output some log data in a format generally accepted by web servers, and thus also by web server log file analyzer software, while other server applications may have other requirements for output.

kbmMW supports several output formats, and also allows adding additional formats, without having to make changes in the developer’s logging statements.

So let us get on with it.

First add the kbmMWLog unit to the uses clause of the units in which you expect to do some logging.

In our case, we have the units Unit7 (main form unit), Unit8 (Smart service unit… the actual REST business code) and Unit9 (a defined sharable TContact object).

It makes sense to add support for logging in Unit7 and Unit8. In Unit7 it would look similar to this:

interface

uses
 Winapi.Windows, Winapi.Messages, System.SysUtils, System.Variants, System.Classes, Vcl.Graphics,
 Vcl.Controls, Vcl.Forms, Vcl.Dialogs, kbmMWCustomTransport, kbmMWServer,
 kbmMWTCPIPIndyServerTransport, kbmMWRESTTransStream,
 kbmMWCustomConnectionPool, kbmMWCustomSQLMetaData, kbmMWSQLiteMetaData,
 kbmMWSQLite, kbmMWORM, IdBaseComponent, IdComponent, IdServerIOHandler, IdSSL,
 IdSSLOpenSSL, IdContext, kbmMWSecurity, kbmMWLog;

And in Unit8 we have also added kbmMWLog to the uses clause.

By simply adding this unit, we can already log by calling one of the methods of the publicly available default Log instance. E.g.

Log.Debug('some debug information');
Log.Info('2 + 2 = %d',[2+2]);

kbmMW’s log system supports these easy access methods:

  • Debug (typically used for development purposes)
  • Info (informs about some non-critical and non-error-like information)
  • Warn (informs about some non-critical, abnormal situation)
  • Error (informs about some error, like an exception or something else which still allows the application to continue to operate)
  • Fatal (informs about an error of such magnitude that the application can no longer run)
  • Audit (informs about some information that you want to be used as evidence in a post-analysis scenario)
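A quick sketch of the six convenience methods in use; the format-string overloads are assumed to be available on all of them, mirroring the Log.Info example above:

Log.Debug('Entering CalculatePrice');
Log.Info('Server started on port %d',[80]);
Log.Warn('Configuration file missing, using defaults');
Log.Error('Could not open database: %s',['connection refused']);
Log.Fatal('Out of disk space, shutting down');
Log.Audit('Actor %s deleted contact %s',['HANS','42']);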

They in turn call a number of generic TkbmMWLog.Log methods, which take arguments for log type, severity, timestamps and much more.

You can ask kbmMW to log content of streams, of memory buffers, XML and JSON documents, byte arrays, and you can even ask kbmMW to produce a stack trace along with your log (not currently supported on NextGen platforms).

In our simple REST server, we might want to log whenever a user logs in, when they are logged out, when a function is called, and when an exception happens.

To intercept the login situation, we will write event handlers for the OnLoginSuccess and OnLoginFail events on the TkbmMWAuthorizationManager instance we have on Unit7.

procedure TForm7.kbmMWAuthorizationManager1LoginFail(Sender: TObject;
 const AActorName, ARoleName, AMessage: string);
begin
 Log.Warn('Failed login attempt as %s with role %s. %s',[AActorName,ARoleName,AMessage]);
end;

procedure TForm7.kbmMWAuthorizationManager1LoginSuccess(Sender: TObject;
 const AActorName, ARoleName: string; const AActor: TkbmMWAuthorizationActor;
 const ARole: TkbmMWAuthorizationRole);
begin
 Log.Info('Logged in as %s with role %s',[AActorName,ARoleName]);
end;

It makes sense to log a successful login as information, while an unsuccessful login is logged as a warning. If it happens often, it could indicate malicious login attempts, so warnings ought to be looked into.

And we might also want to log when a user logs out. The logout may happen automatically due to the user being idle for too long. Refer to the previous blog post for more information.

We might also want to log what calls are made by logged in users.

This can be done in many ways and many places. You could choose to do it within your business logic code in the smart service in Unit8, which makes sense if you want to log some more specific information about the call.

But if you just want to log successful and failed calls, then it’s easy to do so using the OnServeResponse event of the TkbmMWServer instance in Unit7.

As long as the request is formatted correctly and thus served through the TkbmMWServer, kbmMW will attempt to execute it and send a response back to the caller.

The execution may succeed or it may fail, but in all cases the OnServeResponse event will be triggered.

procedure TForm7.kbmMWServer1ServeResponse(Sender: TObject;
 OutStream: IkbmMWCustomResponseTransportStream; Service: TkbmMWCustomService;
 ClientIdent: TkbmMWClientIdentity; Args: TkbmMWVariantList);
begin
 if OutStream.IsOK then
   Log.Info('Successfully called %s on service %s',[ClientIdent.Func,ClientIdent.ServiceName])
 else
   Log.Error('An error "%s" happened while serving request for %s on %s',[OutStream.StatusText,ClientIdent.Func,ClientIdent.ServiceName]);
end;

Now we intercept and log at strategic places in our code, and in fact the logging is already working. But the log output currently only goes to the system log, which on Windows means the debug output (visible in the debugger).

We need our log output to go to a file, preferably with nice chunking when the file reaches a certain size.

The actual output is the responsibility of the log manager. A number of log managers are included with kbmMW:

  • TkbmMWStreamLogManager – Sends log to a TStream descendant.
  • TkbmMWLocalFileLogManager – Sends log to a file.
  • TkbmMWSystemLogManager – Sends log to system log (depends on OS).
  • TkbmMWStringsLogManager – Sends log to a TStrings descendant.
  • TkbmMWProxyLogManager – Proxies log to another log manager.
  • TkbmMWTeeLogManager – Sends log to a number of other log managers.
  • TkbmMWNullLogManager – Sends log to the bit graveyard.

If you have kbmMW Enterprise Edition and thus also have access to the WIB (Wide Information Bus) publish/subscribe transports, you have a couple of additional log managers available for remote logging:

  • TkbmMWClientLogManager – Publishes logs via the WIB
  • TkbmMWServerLogManager – Subscribes for logs on the WIB, and forwards those through other log managers.

You can make your own log manager by descending from TkbmMWCustomLogManager and implementing the IkbmMWLogManager interface.

To use a different log manager than the default system log manager, you simply create an instance of the log manager you want to use and assign it to the TkbmMWLog.Log.LogManager property. Eg.

Log.LogManager:=TkbmMWLocalFileLogManager.Create('c:\temp\mylogfile.log');

However, to configure specific settings on the log manager, it is better to create it in a variable first, set its properties, and then assign that variable to the Log.LogManager property, as sketched below.
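A sketch of that pattern; the commented-out property name is hypothetical, the actual backup/size settings are found on TkbmMWLocalFileLogManager itself:

var
   fileMgr:TkbmMWLocalFileLogManager;
begin
   fileMgr:=TkbmMWLocalFileLogManager.Create('c:\temp\mylogfile.log');
   // Configure backup naming, maximum file size etc. here before assigning.
   // fileMgr.MaxFileSize:=1024*1024; // hypothetical property name, check the documentation
   Log.LogManager:=fileMgr;
end;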

An even easier way is to use one of the Log.Output….. methods, which create relevant log managers for you with settings that are usually good for most circumstances. E.g.

Log.OutputToDefaultAndFile('c:\temp\mylogfile.log');

This will in fact create 3 log managers (a system log manager, a file log manager and a tee log manager) and automatically hook them all up.

In our case we just want to output to a file, so let us stick with TkbmMWLocalFileLogManager. We simply create an instance and assign it to Log.LogManager as shown above.

Now all log output will go to the file, and the file will automatically be backed up and a new one created when it reaches 1 MB in size. Backup naming, size etc. are all configurable on the TkbmMWLocalFileLogManager instance.

You can control which fields are output via the Log.LogManager.LogFormatter property. By default it is a TkbmMWStandardLogFormatter. kbmMW also supports a TkbmMWSimpleLogFormatter which only outputs date/time, type and the actual log string.

The standard log formatter also outputs data type, process and thread information and binary data (usually converted to either Base64 or hexdump (pretty) format).
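If the simpler output is preferred, a minimal sketch of switching formatter (assuming the formatter can simply be replaced by assigning a new instance) would be:

Log.LogManager.LogFormatter:=TkbmMWSimpleLogFormatter.Create;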

There is much more to logging. We didn’t touch on the fact that the log system can handle separate log files for auditing and other logging, and that you can set filtering on each log manager so that a particular log manager only logs certain log types, log levels or data types.

Happy logging.

ANN: kbmMW Professional and Enterprise Edition v. 5.02.00 released!

We are happy to announce v5.02 of our popular middleware for Delphi and
C++Builder.

If you like kbmMW, please let others know! Share the word!

We strive hard to ensure kbmMW continues to set the bar for what an n-tier product must be capable of in the real world!

Keywords for this release:

  • Many ORM improvements
  • New! CRON compatible scheduler support
  • Synchronous encryption improvements
  • AMQP improvements
  • Many other improvements and bugfixes for reported bugs

Please look at the end of this post for a detailed change list.

Professional and Enterprise Edition is available for all with a current
active SAU. If your SAU has run out, please visit our shop to extend it with another
12 months.

CodeGear Edition is available for free, but only supports a specific Delphi/Win32 SKU, contains a limited feature set and does not include source.

Please visit https://portal.components4developers.com to download.

—-

kbmMW is the premier n-tier product for Delphi, C++Builder and FPC on .Net, Win32, Win64, Linux, Java, PHP, Android, iOS, embedded devices, websites, mainframes and more.

Please visit http://www.components4developers.com for more information about kbmMW.

—-

Components4Developers is a company established in 1999 with the purpose of providing high quality development tools for developers and enterprises. The primary focus is on SOA, EAI and systems integration via our flagship product kbmMW.

kbmMW is a portable, highly scalable, high end application server and enterprise architecture integration (EAI) development framework for Win32, .Net and Linux with clients residing on Win32, .Net, Linux, Unix, Mainframes, Minis, Embedded and many other places. It is currently used as the backbone in hundreds of central systems, in
hospitals, courts, private industries, offshore industry, finance, telecom, governments, schools, laboratories, rentals, culture institutions, FDA approved medical devices, military and more.


5.02.00 May 27 2017

Important notes (changes that may break existing code)
======================================================
* Changed Use class in kbmMWSmartUtils.pas. Now it will use TkbmMWAutoValue internally
to store data. Since data stored in TkbmMWAutoValue is reference counted and scoped,
access to the data is slightly different.
Use AsObject to return a reference to the object. Ownership of the object belongs to the TkbmMWAutoValue container.
Use AsMyObject to return a reference to the object and mark it as your object. You will be responsible for freeing it.

New stuff
=========
– Added IkbmMWAutoValue and TkbmMWAutoValue to kbmMWGlobal.pas. They handle scope based object life time handling.
– Changed smart objects (TkbmMWMarshalledVariantData) to use TkbmMWAutoValue.
– Updated DumpVariant in kbmMWGlobal.pas to dump smart objects too.
– Added support for TkbmMWTiming on IOS.
– Added support for REST tags anonymousRoot=true/false and pretty=true/false which
can be used to control whether resulting objects should be anonymous or contained in a
parent object, and whether the result should be pretty-formatted or not (the default).
Pretty-formatting is not implemented for JSON at this time.
– Updated AMQP protocol to by default not write shortstrings, unsigned 8 bit, unsigned 16 bit,
unsigned 32 bit and unsigned 64 bit values. The reason is that although AMQP v. 0.9.1
should support them, the industry doesn't, which is why most AMQP implementations will not understand those types.
It is possible to uncomment a number of defines at the top of kbmMWAMQP.pas to selectively
enable these types. If they are commented out, kbmMW automatically promotes the value to the next
sensible type.
– Updated kbmMWAMQP.pas to support copying field tables instead of assigning them.
– Added Safe property to TkbmMWMixerPasswordGen. If set to true, it will not use
digits and characters that can be visually misread (0 vs O etc).
– Added OnMessageProcessingFailed event to TkbmMWCustomSAFClientTransport and
TkbmMWCustomSAFServerTransport and published in descending classes.
It will be called when message processing fails, for example if kbmMW is
unable to decrypt a message.
– Added support for dynamic arrays in object marshalling.
– Added support for Notify in TkbmMWDateTime and kbmMWNullable. If set (in an ORM scenario)
the client will be notified about the value in that particular field.
– Modified and fixed timezone initialization in kbmMWDateTime.pas.
– Added OutputToDefaultAndFile and OutputToDefaultAndStringsAndFile to TkbmMWLog for
easy setup of outputs.
– Enhanced TkbmMWCustomCrypt to support PassPhraseBytes (which, if set, takes precedence over
PassPhrase (string)).
– Added OnEncryptKeys, OnDecryptKeys, OnDecryptStatus events to TkbmMWCustomCrypt to allow for
attempting various keys before finally either succeeding or giving up.
This can be valuable in supporting client unique encryption/decryption.
– Added a number of GetDefAs…. methods to TkbmMWONArray and TkbmMWONObject which
return a default value if the property/index is missing instead of raising an exception.
– Added GlobalIndexNames property to TkbmMWCustomSQLMetaData. If set kbmMW’s SQL rewriter
knows that index names must be database scope unique, instead of only table scope unique.
– Added Init function that accepts a string as salt to TkbmMWCustomHash.
– Added GetDigest to TkbmMWCustomHash, which returns a byte array with the digested hash.
It's an alternative to using Final.
– Added OnDisconnected and OnException events to TkbmMWAMQPClientConnection.
– Added OnConnect, OnDisconnect, OnDisconnected and OnException events to TkbmMWAMQPClient.
– Added mwsloDateTimeIsUTC to TkbmMWSQLiteOption. Determines how to interpret date time values, as local time or as UTC time.
– Added support for boolean parameter values in TkbmMWSQLite.
– Improved marshalling of kbmMWNullable types.
– Added kbmMWSubjectGetType, kbmMWSubjectExtractNodeID and
kbmMWGenerateMessageSubscriptionSubject to kbmMWSubjectUtils.pas
– Added mwrieNotify to TkbmMWRecordInfoEvent in kbmMWCustomDataset.pas
– Added support for TIMESTAMP datatype in SQL datatype deduction.
– Added support for returning an interfaced object from smart services.
– Added field change detection to TkbmMWFieldDefs.
– Improved TkbmMWRTTI.InstantiateValue in kbmMWRTTI.pas.
– Improved kbmMWNullable.
– Changed Use class in kbmMWSmartUtils.pas. Now it will use TkbmMWAutoValue internally
to store data. Since data stored in TkbmMWAutoValue is reference counted and scoped,
access to the data is slightly different.
Use AsObject to return a reference to the object. Ownership of the object belongs to the TkbmMWAutoValue container.
Use AsMyObject to return a reference to the object and mark it as your object. You will be responsible for freeing it.
– Added methods ToDataset, FromDataset, ListFromDataset to TkbmMWSmartClientORM.
Provides an easy way to convert arguments and results to and from datasets.
– Added Cron fluent method to IkbmMWScheduledEvent. It accepts a 5 or 6 part Unix cron value
which defines the interval.
– Added AtYears, AtMonths, AtDays, AtHours, AtMinutes, AtSeconds methods to IkbmMWScheduledEvent
to give an alternative way to provide cron like schedules.
– Added SynchronizedAfterRun and AfterRun methods to IkbmMWScheduledEvent to
provide an anonymous function to be called after the schedule has run.
It is particularly valuable when scheduling asynchronous operations via RunNow,
followed by updating something with the result of the function.
– Added TkbmMWONSchedulerStorage for storing/retrieving schedules in any object notation format.
– Added support for subscribing for raw messages using anonymous function in WIB.
– Added Delete to TkbmMWORM taking primary key values or alternatively specific field values.
– Added support for many more date formats for ORM data generators.
In addition to LOCAL, UTC and ISO8601, also RFC1123, NCSA, LOCALSINCEEPOCHMS,
UTCSINCEEPOCHMS, LOCALSINCEEPOCH and UTCSINCEEPOCH are supported.
– Generally many additional improvements on ORM.

Fixes
=====
– Fixed default true/false values for TkbmMWSQLiteMetaData.
– Fixed HTTP/REST/AJAX additional incorrect CRLF in output.
– Fixed serious bug in 32 bit random generators (kbmMWRandom.pas).
– Fixed NextGen issues in some parsing routines in kbmMWDateTime.pas.
– Fixed bugs in Query service wizard.
– Fixed some SQL rewriting bugs including adding support for DESCENDING order by.

REST easy with kbmMW #4 – Access management

Building on the previous articles about how to create a REST server using kbmMW, we have now reached the stage where we should consider access management.

What is access management? It’s the “science” of who is allowed to do what.

It is obvious that data exists in this world, which should be protected from being read, created or altered by people/processes we have not authorized to do so. Or turned on its head, some data should be protected and be accessible only by people/processes that we trust.

Other data might be left freely available for reading, but never for modifying and so forth.

Fortunately kbmMW has features built in to support us with that.

We start by adding a TkbmMWAuthorizationManager to the main form (Unit7 in the previous posts).

We can use the authorization manager as is, standalone, but it often makes sense to connect it to the kbmMWServer instance. Thus set the property kbmMWServer1.AuthorizationManager to point to kbmMWAuthorizationManager1.
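This is typically done in the Object Inspector; done at runtime (for example in the form's OnCreate event) it would be a plain assignment like this sketch:

 kbmMWServer1.AuthorizationManager:=kbmMWAuthorizationManager1;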

This way, every call into the application server will be checked by the authorization manager for access rights.

The kbmMW authorization manager is an entity which understands the following concepts:

  • resource
  • actor
  • role
  • authorization
  • constraint
  • login

A resource is basically anything that you want to add some sort of protection for. It can be database related, it can be a specific object, it can be a function or a service that you want to ensure is only handled in ways that you want it to be, by people/processes that you have granted access to it. Resources can be grouped in resource trees, where having access to one resource also automatically provides the same access to resources underneath it.

An actor is typically a person (or a person’s login credentials), a process or something else that identifies “someone” who wants access to your resources.

A role is a way to categorize general access patterns. Roles in a library could be a librarian, an administrator and a borrower. Roles in a bank could be a customer, a teller, a clerk, an administrator and so forth. The idea is that each of the roles has different access rights to the various resources. Actors will usually be given at least one role. An actor can have different roles, for example depending on how the actor logs in, or from where.

An authorization is a “license” to operate as an actor or a role on a specific resource. An authorization can be negative, thus specifically denying an actor or role access to specific resources and their subtrees.

A constraint is a limitation on an authorization or on a login. The authorization may only be valid within a specific timeframe, or only be usable from specific equipment and such, or the login may only be allowed during daytime etc.

A login is the match between an actor/password and a login token. When an actor attempts to log in, the system verifies the login name, password, requested role and whatever constraints have been defined related to logging in. Only when everything has been checked and the login is allowed is a token issued, which the actor/user/process will need to send along with every request it makes to the kbmMW based server.

So let us define two roles we want to have access to our REST server. We can choose to name them ‘Reader’ and ‘ReadWriter’, but as kbmMW does not impose any restrictions on the naming of roles (nor on actors and resources), we can name them anything as long as the names are unique within their category (roles, actors, resources).

  • Reader
  • ReadWriter

In code we define the roles like this (for example in the OnCreate event of the main form):

 kbmMWAuthorizationManager1.AddRole('READER');
 kbmMWAuthorizationManager1.AddRole('READWRITER');

We also, somehow, need to tell the authorization manager which actors exist so it can match up login attempts with actors.

The simple way is to predefine them in the authorization manager. That can for example also happen in the OnCreate event of the form, or elsewhere before the first access to the server. The actors can be defined from a database, a configuration file, LDAP etc. as needed.

 kbmMWAuthorizationManager1.AddActor('HANS','HANSPASSWORD','READER');
 kbmMWAuthorizationManager1.AddActor('CHRISTINE','CHRISTINEPASSWORD','READWRITER');

This defines two actors with their passwords, and which role they should act as upon login if they do not specifically ask for a different role.

It is possible not to predefine actors, but instead use an event handler to verify their existence in a different system via the OnLogin event of the kbmMWAuthorizationManager1 instance.

procedure TForm7.kbmMWAuthorizationManager1Login(Sender: TObject;
 const AActorName, ARoleName: string; var APassPhrase: string;
 var AActor: TkbmMWAuthorizationActor; var ARole: TkbmMWAuthorizationRole;
 var AMessage: string);
begin
...
end;

The actor name is provided in AActorName and the requested role name in ARoleName. Optionally an actor instance may also be provided, if the actor name is known to kbmMW. If not, AActor is nil, and must be created by you if you know about the actor.

ARole may be nil if an unknown role is requested. You can choose to define the role on the fly by returning a newly created TkbmMWAuthorizationRole instance. Remember to add any newly created actor or role instances to the authorization manager's Actors and Roles list properties before returning.

APassPhrase will contain the password delivered with the login attempt. You are allowed to modify it on the fly (for example to change it to a SHA256 hash, so no human-readable passwords are stored in the authorization manager).

If you return nil for AActor or ARole, then it means that the login failed. You can provide an explanation in the AMessage argument if you want.

But let us continue with our simple actor definition for this sample.

Now that we have actors and roles defined, the authorization manager is ready to handle login attempts.

There is only one way to login, and that is by calling the Login method of the authorization manager. This can, for example, be called from a new REST function in your REST service.

An alternative is to let kbmMW automatically detect login attempts, and call the Login method for you. To do that, set the Options property of kbmMWAuthorizationManager1 to [mwaoAutoLogin].
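In code (for example in the form's OnCreate event), that is simply:

 kbmMWAuthorizationManager1.Options:=[mwaoAutoLogin];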

As you may remember, all requests to the kbmMW server must be accompanied by a token identifying a valid login. If that token is not available, kbmMW (with mwaoAutoLogin set) is triggered to use whatever username/password was passed on from the caller as data for a login attempt, and will return the token back to the caller if the login succeeded.

As a REST server is essentially a web server adhering to the HTTP protocol standards, what happens when kbmMW detects an invalid (or missing) login is that kbmMW raises an EkbmMWAuthException, which in turn (when the call comes via the REST streamformat) is translated into an HTTP error 401, which is presented to the caller. In fact, if you raise that exception anywhere within your business code and do not handle it yourself, it will automatically be forwarded to the caller as a 401.

This will prompt most browsers to present a login dialog where username/password can be entered, and the next call back to the server will include that login information. kbmMW will automatically detect this and use it.

So we have actor, role and login in place. Now we need to determine what resources we have. A resource can be anything you want to tag a unique name on.

Most of the time it makes sense to define REST methods as resources. This is done very easily in our smart service, where we have the functions for manipulating and retrieving contacts (Unit8). We use the kbmMW_Auth attribute.

[kbmMW_Service('name:MyREST, flags:[listed]')]
[kbmMW_Rest('path:/MyREST')]
TkbmMWCustomSmartService8 = class(TkbmMWCustomSmartService)
 public
  [kbmMW_Auth('role:[READER,READWRITER], grant:true')]
  [kbmMW_Rest('method:get, path:helloworld, anonymousResult:true')]
  [kbmMW_Method]
  function HelloWorld:TMyResult;

  [kbmMW_Auth('role:[READER,READWRITER], grant:true')]
  [kbmMW_Rest('method:get, path:contacts, anonymousResult:true')]
  function GetContacts:TObjectList<TContact>;

  [kbmMW_Auth('role:[READWRITER], grant:true')]
  [kbmMW_Rest('method:put, path:addcontact')]
  function AddContact([kbmMW_Rest('value:"{$name}"')] const AName:string;
    [kbmMW_Rest('value:"{$address}"')] const AAddress:string;
    [kbmMW_Rest('value:"{$zipcode}"')] const AZipCode:string;
    [kbmMW_Rest('value:"{$city}"')] const ACity:string):string; overload;

  [kbmMW_Auth('role:[READWRITER], grant:true')]
  [kbmMW_Rest('method:get, path:"addcontact/{name}"')]
  function AddContact([kbmMW_Rest('value:"{name}"')] const AName:string):string; overload;

  [kbmMW_Auth('role:[READWRITER], grant:true')]
  [kbmMW_Rest('method:delete, path:"contact/{id}"')]
  function DeleteContact([kbmMW_Rest('value:"{id}"')] const AID:string):boolean;
 end;

 

What happens behind the scenes is that kbmMW automatically defines resource names for the functions like this: MyREST..AddContact, MyREST..GetContacts etc.

Notice the extra dot! If we had defined the service to have a version when we created it, the version would be put between the dots.

As you can see, the resource name is just a string, and you can define all the resources you want yourself, but know that if you use kbmMW smart services, they will automatically define resource names in the above format.

kbmMW will also automatically ask the authorization manager to validate that a caller is allowed to use a resource, upon a call from any client.

You can choose to make finer grained authorization by manually calling the authorization manager for validation of a call like this:

var
  res:TkbmMWAuthorizationStatus;
  sMessage:string;
begin
...
  res:=kbmMWAuthorizationManager1.IsAuthorized(logintoken, 'YOURRESOURCENAME', sMessage);

res can have the value of mwasAuthorized, mwasNotAuthorized or mwasConstrained.

mwasConstrained means that the authorization might be given under different circumstances (a different time of day or similar). The returned sMessage may explain in more detail why access was denied.

In a kbmMW smart service, you can get the login token (logintoken) as an argument to the method like this:

 [kbmMW_Auth('role:[READER], grant:true')]
 [kbmMW_Rest('method:get, path:"someCall"')]
 function SomeCall([kbmMW_Arg(mwatToken)] const AToken:string):boolean;

When the SomeCall method is called, its AToken argument contains the logintoken.

You can also access the service's ClientIdentity.Token property from within your methods instead, if you do not want the token to be part of the argument list of your method call.
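Combining the token argument with the IsAuthorized call shown earlier, a finer grained check inside a smart service method could look roughly like this sketch. YOURRESOURCENAME is a placeholder for a resource name you have defined yourself, and the authorization manager on Form7 is assumed to be reachable (e.g. Unit7 in the uses clause):

// Declaration in the service class, as shown above:
//  [kbmMW_Auth('role:[READER], grant:true')]
//  [kbmMW_Rest('method:get, path:"someCall"')]
//  function SomeCall([kbmMW_Arg(mwatToken)] const AToken:string):boolean;

function TkbmMWCustomSmartService8.SomeCall(const AToken:string):boolean;
var
 sMessage:string;
begin
 // Succeed only when the token is fully authorized for the resource.
 Result:=Form7.kbmMWAuthorizationManager1.IsAuthorized(AToken,'YOURRESOURCENAME',sMessage)=mwasAuthorized;
end;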

Now your REST server is protected by SSL, and calls to its functionality are limited by login.

There are many more features in the authorization manager which I have not explained here, but visit our site at http://www.components4developers.com and look for the kbmMW documentation section for whitepapers.

If you like this, please share the word about kbmMW wherever you can and feel free to link, like, share and copy the posts of this blog to where you find they could be useful.

 

 

REST easy with kbmMW #1

REST servers are very easy to make using kbmMW.

First we make a server application (or service… it’s up to you).

In this case we will add a simple form both providing a GUI and a place for our kbmMW components.

In Delphi click File – New – VCL Forms application

Add one of each of the following kbmMW components to the form:

  • TkbmMWServer
  • TkbmMWTCPIPIndyServerTransport


Set the Server property of kbmMWTCPIPIndyServerTransport1 to kbmMWServer1, as shown in the sketch below.
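This is normally done in the Object Inspector, but if you prefer setting it up at runtime it is just a property assignment (sketch, e.g. in the form's OnCreate before activating the server):

 kbmMWTCPIPIndyServerTransport1.Server:=kbmMWServer1;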

Double click the Bindings property of kbmMWTCPIPIndyServerTransport1 to open its editor. Add a binding for 0.0.0.0 port 80, which is the default HTTP server port. You can choose any other binding you want, but make sure to tell your REST users which port to access.

Set the Streamformat property of kbmMWTCPIPIndyServerTransport1 to REST.

Save your project. Saving it will prompt Delphi to update your uses section with required units.

Double click on the form, to create a form OnCreate event handler.

Enter two lines of code so it looks like this:

procedure TForm7.FormCreate(Sender: TObject);
begin
 kbmMWServer1.AutoRegisterServices;
 kbmMWServer1.Active:=true;
end;

Locate the interface section’s uses clause and add an additional unit kbmMWRESTTransStream.

Save again.

Now your unit code should look similar to this:

unit Unit7;

interface

uses
 Winapi.Windows, Winapi.Messages, System.SysUtils, System.Variants, System.Classes, Vcl.Graphics,
 Vcl.Controls, Vcl.Forms, Vcl.Dialogs, kbmMWCustomTransport, kbmMWServer,
 kbmMWTCPIPIndyServerTransport, kbmMWRESTTransStream;

type
 TForm7 = class(TForm)
 kbmMWServer1: TkbmMWServer;
 kbmMWTCPIPIndyServerTransport1: TkbmMWTCPIPIndyServerTransport;
 procedure FormCreate(Sender: TObject);
 private
 { Private declarations }
 public
 { Public declarations }
 end;

var
 Form7: TForm7;

implementation

{$R *.dfm}

procedure TForm7.FormCreate(Sender: TObject);
begin
 kbmMWServer1.AutoRegisterServices;
 kbmMWServer1.Active:=true;
end;

end.

Basically we now have the foundation for a REST capable web server.

Let’s add some functionality that can be called from any REST client.

In Delphi, click File – New – Other – Components4Developers Wizards and select the kbmMW Service Wizard. Click OK.


Before continuing to select the type of kbmMW service we will add, we need to decide what type of REST server we want to create. It can be a pure REST server, which only serves data from your own code, or it can be a regular web server, which will also be able to serve data from files on disk, like html templates, images, CSS files etc.

For our purpose we just want to make a pure REST server, so we select
Smart service/kbmMW_1.0.

If you wanted it to be able to serve files from disk, and even proxy requests on to other FastCGI compatible servers (like PHP etc) you would have chosen HTTP Smart service.


Click the funny looking next button. Type in the default name your REST service should be known as. In this sample, I’ve called it MyREST.


Click next until you get to the last page, then click the green checkmark button.


Now an almost empty service has been generated for you.

On the surface it looks like a regular TDataModule, and as such can contain any component that you can put on a TDataModule. But right now we are more interested in its code. Press F12 to switch to code view.

Browse past the explanatory remarks at the top, until you get to the actual code.

type

[kbmMW_Service('name:MyREST, flags:[listed]')]
[kbmMW_Rest('path:/MyREST')]
 // Access to the service can be limited using the [kbmMW_Auth..] attribute.
 // [kbmMW_Auth('role:[SomeRole,SomeOtherRole], grant:true')]

TkbmMWCustomSmartService8 = class(TkbmMWCustomSmartService)
 private
 { Private declarations }
 protected
 { Protected declarations }
 public
 { Public declarations }
 // HelloWorld function callable from both a regular client,
 // due to the optional [kbmMW_Method] attribute,
 // and from a REST client due to the optional [kbmMW_Rest] attribute.
 // The access path to the function from a REST client (like a browser)
 // is in this case relative to the services path.
 // In this example: http://.../MyREST/helloworld
 // Access to the function can be limited using the [kbmMW_Auth..] attribute.
 // [kbmMW_Auth('role:[SomeRole,SomeOtherRole], grant:true')]
 [kbmMW_Rest('method:get, path:helloworld')]
 [kbmMW_Method]
 function HelloWorld:string;
 end;

implementation

uses kbmMWExceptions;

{$R *.dfm}


// Service definitions.
//---------------------

function TkbmMWCustomSmartService8.HelloWorld:string;
begin
 Result:='Hello world';
end;

The interesting bits are the kbmMW attributes and the HelloWorld function shown above.

If you compile and run your application now, you have a REST-only capable web server which has one function, HelloWorld, taking no arguments and returning a string.

Open up your favorite web browser and let's test the function by typing this in the address field:

http://localhost/MyREST/helloworld

Make sure the case is correct, since the path part of the URL is case sensitive. If you were to write http://localhost/MyREST/HelloWorld instead, you would be told that the request is invalid.

This is all nice… but my REST client expects to receive a JSON object, not just simple text.

Ok... I’ll show 3 ways to do that: the very manual way, the semi-automated way and the fully automated way.

The manual way. Change the HelloWorld function to look like this:

function TkbmMWCustomSmartService8.HelloWorld:string;
begin
 Result:='{"result":"Hello world"}';
end;

The REST client will receive an anonymous object with a property called result, containing “Hello world”.

The semi automated way:

uses kbmMWExceptions
 ,kbmMWObjectNotation
 ,kbmMWJSON;

{$R *.dfm}


// Service definitions.
//---------------------

function TkbmMWCustomSmartService8.HelloWorld:string;
var
 o:TkbmMWONObject;
 jsonstreamer:TkbmMWJSONStreamer;
begin
 o:=TkbmMWONObject.Create;
 jsonstreamer:=TkbmMWJSONStreamer.Create;

 o.AsString['result']:='Hello world';
 Result:=jsonstreamer.SaveToUTF16String(o);

 jsonstreamer.Free;
 o.Free;
end;

This allows you to create complex JSON documents pretty easily. The cool part is that since we use kbmMW’s object notation framework, we could have chosen to stream it as XML or YAML or BSON or MessagePack instead by simply instantiating the appropriate streamer.
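For illustration, a sketch of the same function streaming XML instead. The class and unit names (TkbmMWXMLStreamer in kbmMWXML) and the SaveToUTF16String method carrying over are assumptions based on the naming pattern above, so check your kbmMW installation:

uses kbmMWExceptions
 ,kbmMWObjectNotation
 ,kbmMWXML; // assumed unit for the XML streamer

function TkbmMWCustomSmartService8.HelloWorld:string;
var
 o:TkbmMWONObject;
 xmlstreamer:TkbmMWXMLStreamer; // assumed XML counterpart of TkbmMWJSONStreamer
begin
 o:=TkbmMWONObject.Create;
 xmlstreamer:=TkbmMWXMLStreamer.Create;
 try
  // Same object notation document as before, just streamed as XML this time.
  o.AsString['result']:='Hello world';
  Result:=xmlstreamer.SaveToUTF16String(o);
 finally
  xmlstreamer.Free;
  o.Free;
 end;
end;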

The automated way:

type

TMyResult = class
private
 FResult:string;
public
 property Result:string read FResult write FResult;
end;

 [kbmMW_Service('name:MyREST, flags:[listed]')]
 [kbmMW_Rest('path:/MyREST')]
 // Access to the service can be limited using the [kbmMW_Auth..] attribute.
 // [kbmMW_Auth('role:[SomeRole,SomeOtherRole], grant:true')]

TkbmMWCustomSmartService8 = class(TkbmMWCustomSmartService)
 public
 [kbmMW_Rest('method:get, path:helloworld, anonymousResult:true')]
 [kbmMW_Method]
 function HelloWorld:TMyResult;
 end;

implementation

uses kbmMWExceptions;

{$R *.dfm}

// Service definitions.
//---------------------

function TkbmMWCustomSmartService8.HelloWorld:TMyResult;
begin
 Result:=TMyResult.Create;
 Result.Result:='Hello world';
end;

initialization
 TkbmMWRTTI.EnableRTTI(TkbmMWCustomSmartService8);
 kbmMWRegisterKnownClasses([TMyResult]);
end.

The automated way simply means returning an object with the desired information. kbmMW will automatically convert the object to JSON (because we are using the REST streamformat).

To make sure that kbmMW knows about the object type, we register it via kbmMWRegisterKnownClasses. If we didn’t, kbmMW would complain that it does not know about the object.

Do not worry about the TMyResult instance being leaked. kbmMW will automatically free it when it goes out of scope. If you specifically do not want the returned object to be freed by kbmMW, you can tell so by including freeResult:false in the kbmMW_Rest attribute for the HelloWorld method.
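For example, keeping ownership of the returned object yourself would just mean extending the attribute from the sample above like this (sketch):

 [kbmMW_Rest('method:get, path:helloworld, anonymousResult:true, freeResult:false')]
 [kbmMW_Method]
 function HelloWorld:TMyResult;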

Also notice that the kbmMW_Rest attribute now includes anonymousResult:true.

This tells kbmMW that we want the resulting JSON to be anonymous. If we didn’t include that attribute setting, the result would have looked like this:

{"TMyResult":{"Result":"Hello world"}}

Which is not necessarily wrong, but different.
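With anonymousResult:true set, as in the sample, the TMyResult wrapper is presumably dropped and the result becomes just:

{"Result":"Hello world"}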

There are lots of options for controlling how the object is streamed, by setting various attributes on it. One can for example choose that the Result property should be returned under a different name etc.

kbmMW also understands returning TkbmMemTable instances, arrays and many other types of information, so it is really easy to REST’ify your kbmMW business functionality with almost no lines of additional code.

As a final comment, since the HelloWorld method is also tagged with the attribute [kbmMW_Method], it is also callable by native kbmMW clients.

Feel free to share, comment and like this post 🙂