Modernization of mainframe applications by implementing a cloud-based agile architecture

This article discusses how legacy mainframe applications can be modernized to enable teams to work in a DevOps/Agile mode for faster development and deployment, without rewriting or porting mainframe code, which is expensive and time consuming.

Below is a list of proposals to uplift mainframe waterfall work processes to an Agile/DevOps methodology. Each point addresses a shortfall in a mainframe-based environment that needs to be addressed before adopting DevOps/Agile practices or tools.

  1. Create APIs to access data residing on the mainframe using z/OS components such as z/OSMF, the Db2 REST interface, IMS SOAP Gateway, etc.
  2. Create APIs for all existing IMS transactions so that front-end applications can use APIs instead of tight MQ-based integrations. For example, z/OS Connect offers the capability to generate an API for every defined IMS transaction.
  3. Establish an API gateway (e.g., Kong Gateway) to manage all these APIs and microservices.
  4. Create CI/CD pipelines with automated test scripts for mainframe microservices and APIs.
  5. Uplift old front-end applications to a new microservice-based architecture that can run in Docker containers.
  6. The Docker containers running front-end applications should be provisioned and configured using a container orchestration platform such as Kubernetes.
  7. Mainframe-side development should be uplifted from traditional mainframe-based test environments to x86 mainframe emulators (e.g., IBM zPDT), which can scale dynamically.
  8. Mainframe test environments should be provisioned dynamically from application images using products like IBM ZD&T on a cloud infrastructure platform, rather than spending millions of dollars building new test environments and creating dedicated teams to maintain them.

Here we briefly discuss creating APIs to access data residing on the mainframe using z/OSMF and the Db2 REST services available on the mainframe.

Instructions for z/OSMF

  1. Make sure z/OSMF is installed and running on the mainframe LPAR you want to extract data from.
  2. Find the z/OSMF port from the configuration files in /var/zosmf/configuration/servers, or from the STDOUT of the IZUSVR* jobs running in the spool.
  3. Get access to z/OSMF by having your mainframe ID added to the RACF profile linked to the USS user group specified in the z/OSMF configuration. (Check with your administrators/security team if you are not sure where to look.)
  4. Install Python or prepare a Java Spring Boot environment.
    • Refer to the links below for installing Java Spring Boot or Python:
      • Java Spring Boot instructions
      • Python instructions
  5. Create API endpoints for accessing mainframe data using the Python flask module or Java Spring Boot.
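As a quick check that steps 1–3 are in place, the z/OSMF information retrieval service (/zosmf/info) can be queried over HTTPS. Below is a minimal sketch using the requests module; the hostname and port are placeholders for your own LPAR's values:

```python
import requests

def zosmf_info_url(host, port):
    # /zosmf/info is z/OSMF's information retrieval service; it reports
    # the z/OSMF version and the list of installed plug-ins
    return "https://%s:%s/zosmf/info" % (host, port)

def check_zosmf(host, port):
    # verify=False tolerates the self-signed certificates common on test LPARs;
    # the X-CSRF-ZOSMF-HEADER header is required by several z/OSMF REST services
    resp = requests.get(zosmf_info_url(host, port),
                        headers={"X-CSRF-ZOSMF-HEADER": ""},
                        verify=False)
    resp.raise_for_status()
    return resp.json()
```

A successful reply is a JSON document describing the z/OSMF instance, which confirms the port and your access are set up correctly.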

Using the Python flask and requests modules to create API endpoints:

Once Python is installed, use pip to install the flask and requests modules. Flask will be used to create a local server with API endpoints on localhost; the requests module will be used to send GET/POST/PUT/DELETE requests to z/OSMF or the Db2 REST interface.
Below is sample code to do this.

The payload should be formatted in JSON as per the schema of the mainframe API the URL points to.
Authorization: if the mainframe APIs use basic authentication, create the Base64-encoded text of the string "userid:password".
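For example, the Basic authorization value can be built with Python's standard base64 module (userid and password here are placeholders):

```python
import base64

def basic_auth_header(userid, password):
    # HTTP Basic authentication: Base64-encode the string "userid:password"
    token = base64.b64encode(("%s:%s" % (userid, password)).encode("ascii")).decode("ascii")
    return "Basic " + token
```

The returned value goes into the Authorization field of the request header.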

import requests

url = "https://hostname:1111/..."  # endpoint of the target z/OSMF or Db2 REST API
payload = "..."                    # request body formatted per the target API's schema

header = {
    'Content-Type': "text/plain",
    'X-IBM-Data-Type': "text",
    'Authorization': "Basic cDczNzQasdfasdFyMDcxOQ==",  # Base64 of "userid:password"
    'Host': "hostname:1111",
}
response = requests.request("PUT", url, data=payload, headers=header, verify=False)
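With the requests call working, step 5 can be completed by wrapping it in a small Flask application. The sketch below is illustrative only: the /customers route, the mainframe URL, and the credentials are placeholder names, not part of any real system:

```python
from flask import Flask, jsonify
import requests

app = Flask(__name__)

# Placeholders: point these at your own mainframe REST service and credentials
MAINFRAME_URL = "https://hostname:1111/services/SYSIBMSERVICE/myService"
AUTH_HEADER = "Basic <base64 of userid:password>"

@app.route("/customers", methods=["GET"])
def customers():
    # Forward the request to the mainframe REST service and relay its JSON reply
    resp = requests.post(
        MAINFRAME_URL,
        json={},
        headers={"Authorization": AUTH_HEADER,
                 "Content-Type": "application/json"},
        verify=False,
    )
    return jsonify(resp.json()), resp.status_code

# Start the local server with: app.run(host="0.0.0.0", port=5000)
```

Front-end applications then call this endpoint instead of talking to the mainframe directly.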

Instructions for the Db2 REST interface:

If we want to get data from Db2 tables on the mainframe using the Db2 REST interface, then follow the steps below to create a Db2 service that can be called by our Python code or Java Spring Boot code:


  • To create a service, issue an HTTP or HTTPS POST request through a REST client with the following URI: POST https://<host>:<port>/services/DB2ServiceManager
  • Set the HTTP Accept and Content-Type header fields to application/json for the request.
  • Specify the create service parameters as JSON key/value pairs in the HTTP body of the request. The requestType, sqlStmt, collection, serviceName, description, and version JSON keys are case sensitive; general service create bind option keys are case insensitive. The requestType, sqlStmt, and serviceName parameters are required. To use the version create service parameter, REST service versioning must be enabled. Specify the create service HTTP body content:
"requestType": "createService",
"sqlStmt": "<sqlStatement>",
"collectionID": "<serviceCollectionID>",
"serviceName": "<serviceName>",
"description": "<serviceDescription>",
"version": "<version identifier>",
"<bindOption>": "<bindOptionValue>",
"<bindOption>": "<bindOptionValue>"


  • createService indicates that you are requesting the creation of a new service.
  • <sqlStatement> is the SQL statement that you include in the new service. For each new service, you can embed a single CALL, DELETE, INSERT, SELECT, TRUNCATE, UPDATE, or WITH SQL statement. When you create the service, Db2 also binds a package that is used to invoke the service.
    • Db2 adds a new row to the user-defined SYSIBM.DSNSERVICE catalog table for the service and saves the bound package in the directory. Db2 also sets the HOSTLANG column in the SYSIBM.SYSPACKAGE and SYSIBM.SYSPACKCOPY tables to 'R' to mark the package for the REST API.
  • <serviceCollectionID> is the collection identifier of the package that is associated with the new service. This property is optional. If you specify it, Db2 names the package in the form collectionID.serviceName. Otherwise, Db2 uses the default form SYSIBMSERVICE.serviceName.
  • <serviceName> is the name of the new service that you create.
  • <serviceDescription> is a brief description of the new service. The description property is optional. If provided, Db2 stores the value in the DESCRIPTION column of the SYSIBM.DSNSERVICE catalog table.
  • <version identifier> is an optional version identifier for the service being created. It is only valid if REST service versioning support is enabled.
  • <bindOption> is an option that you specify for binding the package that is associated with the new service. Bind options are optional. All bind options supported by the BIND SERVICE subcommand, except DESCRIPTION, NAME, SQLDDNAME, SQLENCODING, and VERSION, apply to the createService API. The createService API uses the serviceName, description, and version parameters instead of the NAME, DESCRIPTION, and VERSION bind options of the BIND SERVICE subcommand. See BIND SERVICE (DSN) for the supported bind options, and BIND and REBIND options for packages, plans, and services for details about the bind options.
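Putting the steps above together, the createService request can be issued from Python with the requests module. This is a sketch; the base URL, credentials, and SQL passed in are placeholders:

```python
import requests

def build_create_request(service_name, sql, collection=None):
    # Only requestType, sqlStmt, and serviceName are required; keys are case sensitive
    body = {"requestType": "createService",
            "sqlStmt": sql,
            "serviceName": service_name}
    if collection:
        body["collection"] = collection
    return body

def create_db2_service(base_url, userid, password, service_name, sql, collection=None):
    # POST the create request to Db2's service manager endpoint
    return requests.post(
        base_url + "/services/DB2ServiceManager",
        json=build_create_request(service_name, sql, collection),
        headers={"Accept": "application/json", "Content-Type": "application/json"},
        auth=(userid, password),
        verify=False,
    )
```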

Once the service is created, we can call it using the Python requests module or Java Spring Boot code to retrieve data from the Db2 tables using the predefined SQL query of the service.
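Invoking the created service is another POST, this time to the service's own URI (of the form /services/<collection>/<serviceName>). Again a sketch with placeholder names:

```python
import requests

def service_url(base_url, collection, service_name):
    # The invocation URI of a Db2 REST service
    return "%s/services/%s/%s" % (base_url, collection, service_name)

def invoke_db2_service(base_url, userid, password, service_name,
                       params=None, collection="SYSIBMSERVICE"):
    resp = requests.post(
        service_url(base_url, collection, service_name),
        json=params or {},  # values for any parameter markers in the service's SQL
        headers={"Accept": "application/json", "Content-Type": "application/json"},
        auth=(userid, password),
        verify=False,
    )
    resp.raise_for_status()
    return resp.json()
```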

Containerizing services:

Once the APIs for accessing mainframe data are created using Python or Java Spring Boot, we can create containers to deploy them to production. Containerization enables load balancing and capacity management using container management platforms like Kubernetes or Docker Swarm. A Docker image for the container can be built by writing a Dockerfile similar to the one below:

# Alpine is recommended as a base image for its small size
FROM alpine:latest
# Install Python and pip in the container (python3/py3-pip in current Alpine)
RUN apk add --no-cache python3 py3-pip
# set working directory
WORKDIR /MF-APP
# install dependencies
RUN pip3 install flask requests
# copy files to container
COPY . /MF-APP
# Expose the server port; in this case the Flask default port
EXPOSE 5000
# start the app (app.py is a placeholder for your application's entry point)
CMD ["python3", "app.py"]

For more information on how to write Dockerfiles, see the Docker user guide and the Dockerfile reference.
Once the above file is created, issue the command "docker build ." from the directory containing it; this builds a local image. The container can then be started with "docker run -p 5000:5000 <image-id>". To manage local images and publish them to a Docker registry, use a Docker GUI such as Portainer.

Synergy for keyboard & mouse sharing across PC/Linux/Mac systems

Synergy is a nice piece of software that can be used to share a keyboard and mouse across systems, so that we no longer need to unplug and replug the hardware between systems.

This software is available on Windows, Linux, and Mac (cross-platform). It can be used as an alternative to expensive hardware KVM switches.

This software also offers SSL encryption and clipboard sharing.

However, I came across some issues with SSL on my laptop running openSUSE Leap 42.2 (a "dependency not found" error). Also, the Raspberry Pi 3 repos have an old version which doesn't show the SSL option.

To get around these issues I recompiled the code from the Git repository on Linux (openSUSE Leap 42.2) and Raspberry Pi 3 (Jessie with PIXEL).

The compiled distribution binaries can be downloaded from the links below.

Raspberry Pi 3 (.deb ARM version) – !4xxzWA6L!Yj9yLbczCW1m90-je0F3RZP1-azyzgbymkorkpC4W38

openSUSE (.rpm x86_64 version) – !JpgUHBQR!BYmEXOILv0E07h4PcoQtibLoGJ2gEjBuzDlgT8f7oiM


Using a Bluetooth mouse on a Windows/Linux dual-boot machine without re-pairing

I got a nice Logitech MX Master Bluetooth mouse. Whenever I booted into a different OS on my dual-boot laptop I used to lose the Bluetooth connection. I came across an article on the internet which explains how to avoid re-pairing a Bluetooth mouse on dual-boot machines.

It explains how to extract the Bluetooth connection keys from the Windows registry using Sysinternals PsExec, and then update the Bluetooth configuration file under Linux to use the same connection keys.

Instead of doing the tricky conversions mentioned in the above link, you can note down the decimal equivalents of the values from the Windows registry and use those values to update the Linux conf file.


I had to make the following two updates to make it work:

  1. I updated IdentityResolvingKey in addition to the keys mentioned in the above link.
  2. I left EncSize as 16 instead of changing it to zero.

Tip: instead of exporting and then doing a tricky conversion of the registry values, do the following steps.

Issue command: psexec -s -i regedit

From there, export the folder "HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Services\BTHPORT\Parameters\Keys".

In the Registry Editor, note down the decimal values of EDiv and Rand so that we don't need to convert them later.

To easily reformat the IRK, LSK, and LTK values into the Linux format, use the Python interpreter as below:

  1. Open the Python interpreter and assign the key string to a variable, e.g.
    a = 'dc,07,9f,29,1b,84,23,f1,be,fe,73,73,a4,c3,d8,c7'
  2. Now, to get the reformatted value, use the following Python code:
    a.replace(",", "").upper()

Good luck!
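The same reformatting can be scripted; below is a small helper applied to the example key from step 1:

```python
def to_linux_key(windows_key):
    # The registry export gives the key as comma-separated lowercase hex bytes;
    # the Linux configuration file wants one uppercase hex string with no separators
    return windows_key.replace(",", "").upper()

print(to_linux_key("dc,07,9f,29,1b,84,23,f1,be,fe,73,73,a4,c3,d8,c7"))
# prints DC079F291B8423F1BEFE7373A4C3D8C7
```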



Adding an HourGlass statement to JCL using the REXX exec AGGDD

HourGlass DD statements can be generated automatically using the provided REXX exec AGGDD. From the EDIT command line, type AGGDD after placing an A (after) or B (before) line command designating the desired location for the generated HourGlass DD statement. Optionally, the desired date can be passed on the command line as a parameter in the format CCYY-MM-DD. Without any parameters, an ISPF panel prompting for the desired date is displayed.

To simply display the correct HourGlass DD statement without adding it to your current EDIT session, enter the AGGDD exec from any ISPF command line. The user is prompted for the date in CCYY-MM-DD format and the time in dHHMM format.
—————— IBM HourGlass: Generate HourGlass DDCard ——————
Enter Desired Runtime Date (ccyy-mm-dd): 1997-12-06
Enter Desired Time: Plus/Minus (P/M): P  HoursMinutes (hhmm): 0100
East/West (E/W)
Fixed Step Start (F)
Absolute Constant Time (A)

After pressing ENTER, the appropriate HourGlass DD statements are shown:

—————— IBM HourGlass: Generate HourGlass DDCard ——————
Desired Runtime Date (ccyy-mm-dd): 1997-12-06
DDcard to use: //HG097340 DD DUMMY
Desired Runtime Time (d hhmm): P 0100
DDcard to use: //HGP0100 DD DUMMY
Hit ENTER to Continue

Press PF3 to exit this dialog.

Auto turn on electric devices using RF sockets and a Raspberry Pi

Yesterday I wrote a small script to turn on the electric heater in my room when the temperature is below 18°C.

Below is the bash script:

#!/bin/bash
# Fetch the current temperature (°C) from the OpenWeatherMap API;
# q=<city> and the APPID value are placeholders for your own city and API key
temp=`node -pe 'JSON.parse(process.argv[1]).main.temp' "$(curl -s "http://api.openweathermap.org/data/2.5/weather?q=<city>&APPID=XXXXXXXXXXXXXXXXXXXX&units=metric")"`
echo "Current temperature $temp"
if [ $(echo "$temp < 18" | bc -l) -gt 0 ]; then
  echo "Switching on Heater"
  codesend XXXXXXXX
fi

codesend is a binary that uses the GPIO pins on the Raspberry Pi to send an RF signal through the connected RF transmitter. I copied the source code from the RFutils project and tweaked it a bit as per my requirements.
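To make the switching automatic rather than a one-off run, the script can be scheduled with cron. A sketch of a crontab entry, assuming the script above is saved as /home/pi/heater.sh (a hypothetical path):

```
# Run the temperature check every 10 minutes and log the output (edit with: crontab -e)
*/10 * * * * /home/pi/heater.sh >> /home/pi/heater.log 2>&1
```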