Modernization on the IBM i

 

AO Foundation
What is a modern database?

Understand your existing application:
Talk to the users and the programmers.
Review error logs.
Review the to-do list.

You need to decide if modernization is right for you.

You could replace the entire application by purchasing a package, if your existing application lacks functionality.

You could recreate your application with an in-house development project, but only if your existing application is inadequate and no package solution is available.

You could modernize your existing application
Learn what data-centric programming is (link) and the MVC (Model/View/Controller) architecture.

Decide if modernization is right for you.

Rank and sequence what modernization means to you:

DDS > DDL (SQL)
Field reference file
I/O servers
Event triggers
Constraints
Commitment control
User interface

All new developments should adhere to the modern application definition. There is no reason to continue with program-centric development. Once you adopt a data-centric approach, you will never revert.

Adopt a modernize-as-you-go approach.

Existing applications that are currently program-centric will require planning.
A clear understanding of the end product is required.
Theory has gotten you this far, but to get the most out of your DB, you will need to know specific techniques and have an intimate understanding of how data is defined and moved around.
Choose the AO education course or download training videos.

How to go about DB modernization

Big Bang – Scary

I suggest one file at a time, starting with a simple master file for learning purposes, then moving on to the critical files.

Reference file: The AO repository standardizes data definitions.
Create a definition for the Company (not the file)
Replace prefixes by using qualifications in the programs
DDS to DDL: AO Foundation converts DDS to DDL without creating level checks.
By converting the DB to SQL, you can tap into the programmers who were taught SQL at school.
SQL long names make code easier to understand.
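As a sketch of what such a conversion might produce (library, table, and column names here are hypothetical), the DDL can carry readable SQL long names while preserving the original short DDS names, so existing RPG code continues to compile:

```sql
-- Hypothetical example: SQL long names with the original short DDS names
-- preserved via FOR SYSTEM NAME / FOR COLUMN, so existing RPG keeps working.
CREATE TABLE MYLIB.CUSTOMER_MASTER FOR SYSTEM NAME CUSMAS (
  CUSTOMER_ID    FOR COLUMN CUSID   INTEGER        NOT NULL,
  CUSTOMER_NAME  FOR COLUMN CUSNAM  VARCHAR(40)    NOT NULL,
  CREDIT_LIMIT   FOR COLUMN CRDLIM  DECIMAL(9, 2)  NOT NULL DEFAULT 0,
  PRIMARY KEY (CUSTOMER_ID)
);
```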

I/O servers: improve the agility of the database by consolidating file I/O into a single program, which results in fewer programs to compile when changing a file definition.

Event triggers: Before triggers validate data before adding it to the table/file, making the DB independent of the program.

After triggers help keep data up to date in real time by replacing weekend, month-end, and year-end processing.
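As a sketch (table and column names are hypothetical), a before trigger on DB2 for i can reject a bad row no matter which program, DFU session, or SQL statement tries to insert it:

```sql
-- Hypothetical before trigger: validates data at the point of entry,
-- independent of whatever program performs the insert.
CREATE TRIGGER MYLIB.CUSMAS_BI
  BEFORE INSERT ON MYLIB.CUSTOMER_MASTER
  REFERENCING NEW AS N
  FOR EACH ROW
BEGIN
  IF N.CREDIT_LIMIT < 0 THEN
    SIGNAL SQLSTATE '75001'
      SET MESSAGE_TEXT = 'Credit limit may not be negative';
  END IF;
END;
```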

Constraints: enforce the data model and are part of the operating system. Just monitor for the errors.
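For example (names hypothetical), a referential constraint that prevents orphaned order rows is a single statement, after which the operating system does the checking:

```sql
-- Hypothetical referential constraint: the OS now rejects any order row
-- whose customer does not exist; no application code is required.
ALTER TABLE MYLIB.ORDER_HEADER
  ADD CONSTRAINT ORDHDR_CUSTOMER_FK
  FOREIGN KEY (CUSTOMER_ID)
  REFERENCES MYLIB.CUSTOMER_MASTER (CUSTOMER_ID)
  ON DELETE RESTRICT;
```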

Commitment control: Only commits the updates when all files are updated.
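A minimal sketch (hypothetical order tables, job running under commitment control): two related writes are treated as one unit of work, so either both rows become permanent or neither does.

```sql
-- Hypothetical: an order header and its first detail line written as
-- one unit of work under commitment control.
INSERT INTO MYLIB.ORDER_HEADER (ORDER_ID, CUSTOMER_ID)
  VALUES (1001, 42);
INSERT INTO MYLIB.ORDER_DETAIL (ORDER_ID, LINE_NO, ITEM_ID, QUANTITY)
  VALUES (1001, 1, 'A100', 5);
COMMIT;    -- all files updated: make the changes permanent
-- On any failure before the COMMIT, issue ROLLBACK instead and
-- neither file is changed.
```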

User Interface: Users no longer want to see the green screen. Screen scrapers are fast. JavaScript or Python may mean acquiring new programming skills.

These descriptions of the various components of modernization are brief. There are implications associated with each component that are better handled in a phone call, so please reach out to me to talk about it.

Field Resizing Conundrum on IBM i

Database modernization

With a legacy database you will need to change every program that uses the file at the same time you change the file definition. You simply have no choice. Level checking forces you to do so.

With a modern database using I/O servers, you can minimize the effort required. (Terms & Conditions apply)

Conditions:

  1. Add an unused field to the end of the existing file.
  2. Replace all the file I/O with an I/O server that does exactly what the READ, WRITE, SETLL, and CHAIN operations do. You still need to recompile every program that uses the file, but life gets better after introducing the I/O server. This can be accomplished with minimal programming changes.

The Secret Sauce (parameters used to call the I/O server):

  • FUNCTION – I=Insert, U=Update, D=Delete…
  • POINTER – provided by the caller of the I/O server
  • KEY – key field or fields (optional depending on function)

Why locate the record format with a pointer?

  1. The parameter list will not have to change every time you add a field to the file.
  2. Programs using the I/O server will only have to be recompiled when an existing field or the record length changes.

Why add a 20-byte field of unused space to the record?

  To reserve space in the I/O server to accommodate adding fields to the file/table in the future without disturbing the running of existing programs. The only programs that need to be recompiled are the ones using the newly added fields. Programs not using the new field are not affected by the change and will continue to work without recompilation.

This investment in an I/O server will allow you to handle DB changes with minimal disruption, because only one program accesses the file. This also simplifies the administration of the file.

 

Adding a Field

  1. Add the field/column to the file/table.
  2. Be careful not to change the row/record length.
  3. Compile the I/O server and the programs requiring the new field.
  4. You don’t need to compile anything that is not using the new field.
  5. Copy data.
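A sketch of steps 1–2 together (names hypothetical): shrink the unused filler by exactly the size of the new column, so the total record length stays the same.

```sql
-- Hypothetical: add a 5-byte column while shrinking the reserved filler
-- from CHAR(20) to CHAR(15), keeping the record length unchanged so
-- programs that don't use the new field need no recompile.
ALTER TABLE MYLIB.CUSTOMER_MASTER
  ALTER COLUMN FILLER SET DATA TYPE CHAR(15)
  ADD COLUMN REGION_CODE FOR COLUMN REGCDE CHAR(5) NOT NULL DEFAULT ' ';
```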

Resizing a field:

  1. Make Changes to the file/table.
  2. Copy data from the old format to the new format.
  3. Recompile the I/O server and all programs that use it.
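As a sketch of steps 1–2 (hypothetical names): on DB2 for i, ALTER COLUMN SET DATA TYPE changes the definition and carries the existing data into the new format in one operation.

```sql
-- Hypothetical resize: widen CUSTOMER_NAME from VARCHAR(40) to VARCHAR(60).
-- The system rebuilds the table and copies the existing data forward;
-- the record length changes, so the I/O server and its callers recompile.
ALTER TABLE MYLIB.CUSTOMER_MASTER
  ALTER COLUMN CUSTOMER_NAME SET DATA TYPE VARCHAR(60);
```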

I/O servers go a long way to simplify the maintenance and administration of a file.

See our open-source page for examples of a modern DB. https://databoroughservices.com/rpg-open-source-project/

APIs vs. Database Renovation

Database modernization

Functionality:

    • API: An API provides a set of protocols, tools, and definitions for building and integrating software applications. It defines how software components should interact, enabling different systems to communicate. (The bridge between the frontend and the application code)
    • Agile Data Centric Database: An agile database refers to a design approach that emphasizes flexibility and adaptability. It allows for iterative development, making it easier to achieve incremental improvements over time. By implementing event processing, constraints, and the necessary steps to achieve a 3rd normal form, you add agility to the database. 

In a modern application, they both perform essential and separate functions. This post will focus on the functionality that they could share: both can be used to enforce business rules. An API in a program-centric application likely encapsulates business logic written before the GUI (Graphical User Interface) era. In a data-centric world, a significant amount of the business logic can be built into the database rather than the API.

Flexibility:

    • API: APIs provide flexibility in integrating different systems and services. They allow developers to access specific functionalities or data from one system to another.
    • Agile Data Centric Database: Agile databases provide flexibility by adapting to changing business needs and requirements. Depending on the level of agility achieved, they can accommodate changes in data structures, relationships, and business rules without requiring significant redesign or disruption.

The API provides the connection between the database and the GUI. The database contains and maintains the company’s most valuable asset. (DATA)

Development Process:

    • API: Developing an API involves defining endpoints, request-response formats, authentication mechanisms, and documentation. It typically follows a structured process and requires collaboration between teams responsible for building different system components.
    • Agile Data Centric Database: Agile database development follows agile principles, emphasizing iterative development, continuous feedback, and collaboration between developers, business stakeholders, and database administrators. Changes are made incrementally based on evolving requirements and user feedback.

It is likely faster to develop an API than it is to renovate your database. However, speed is not the only consideration: ongoing application maintenance should be weighed along with the mounting technical debt. When you create an API with business logic included, you must maintain those rules in two or more places. When you associate the business rules with a file through an event trigger program, you consolidate the business logic in one place and reduce the maintenance effort in the future.

In the long term, the database of an IBM i application will need to evolve to meet these future demands, focusing on scalability, performance, security, integration, flexibility, analytics, cloud compatibility, and high availability.

In summary, APIs and agile databases potentially have overlapping purposes. They both play important roles in building modern software systems. APIs should facilitate communication and integration between software components. Agile databases support flexible and adaptive data management, including business logic, while accommodating changing business needs. Together, they enable the development of dynamic and responsive applications that can evolve and scale over time.

For the sake of agility, it is better to make a strategic decision to enforce business rules at the database level rather than in the API, to avoid the programming redundancy and expense of doing it more than once.

Making database changes, such as adding a field or changing a field size, can be intimidating, especially during normalization. I will include an example explicitly created for one of our customers to test and implement file normalization changes safely.

https://databoroughservices.com/2023/04/27/low-risk-file-normalization-modernization/ 

You asked for Modernization

Overview of components that make up DB2

You got a new user interface. Be careful how you throw around the term modernization. You thought a new user interface was all you needed, but you forgot about the DB (database). The demands of a modern application may exceed the capabilities of your DB.

To make what I’m about to suggest relevant, let me share some coding practices from the past.

There was a time when we described files in programs. It was possible to describe a field as numeric in one program and character in another. Then we started using externally described files, which greatly improved the quality of the data.

Since the introduction of externally described files, there have been many improvements. They focus on data quality and the programming effort required to achieve optimum results.

Before event triggers offer an ironclad method of vetting every record/row added to or updated in a file/table. (No more wondering how those orphaned records got there.) Business logic is associated with the add/change/delete function. Data quality is checked at the point of entry. Not even DFU or SQL can get around this.

After event triggers can keep data up to date in real time. Traditional end-of-day, week, month, and year processing can be performed as a result of successfully adding, updating, or deleting a record/row.
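For instance (tables and columns hypothetical), an after trigger can maintain a running total that would otherwise wait for batch processing:

```sql
-- Hypothetical after trigger: keeps a sales total current in real time
-- instead of recalculating it in batch at month end.
CREATE TRIGGER MYLIB.ORDDTL_AI
  AFTER INSERT ON MYLIB.ORDER_DETAIL
  REFERENCING NEW AS N
  FOR EACH ROW
BEGIN
  UPDATE MYLIB.SALES_SUMMARY
    SET TOTAL_SOLD = TOTAL_SOLD + N.QUANTITY
    WHERE ITEM_ID = N.ITEM_ID;
END;
```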

Constraints: save a lot of programming because they enforce the data relationships and prevent orphaned records. You let the DB management system check the constraints you define. You could continue to perform this check in the programs or simply let the operating system do it.

I/O (input/output) servers are border control points for security and rollback considerations. They simplify the administration of security and data usage.

AO Foundation (my solution) can assist you by generating programs that support these modern concepts.

By doing the UI first, you are increasing the technical debt, because you will be forced to convert the business logic and the constraints to the language of the new UI and/or use an API. An API is a repackaging of the existing business logic, which means both versions of the business rules will need to be maintained when requirements change.

Side effect of renovating the database first: it becomes easier to replace the user interface.

Consolidation of the business rules in trigger programs improves agility, thus making it a more strategic solution while minimizing API requirements. Not to mention the cost associated with maintaining several versions of the business rules.

If you’re ready to modernize your application, take advantage of this special offer. Don’t let outdated coding practices hold you back. Upgrade your application today.

https://databoroughservices.com/special-offer/

IT Managers: Why are you spending resources on something the operating system does?

Database modernization

Specifically the database: if you’re not taking advantage of current technology (I mean trigger programs, constraints, and I/O servers), you’re building up technical debt, and it’s going to cost you big time in the future.

The sooner you accept that a modern database will serve you better than using program-centric techniques to do things that are handled by the operating system, the sooner you will realize the benefits.

The problem with enforcing data integrity in monolithic programs is that it requires programming effort for something that can and should be done by the operating system.

One problem with keeping the business logic in a monolithic program is that a file may be added to and maintained by multiple programs, which creates a need to keep the business logic in sync across all of them. By consolidating the business logic in one place (trigger programs/service programs), you negate the need to do so in the application programs. It is simply the best place to do the job.

Time for a new User Interface: It doesn’t matter what language you’re going to use for the new UI. If you’re not using trigger programs and constraints you will be forced to convert the referential integrity logic and the business logic to the new language. This means new programming skills will be needed and old programming skills will not.

You also need to consider what future demands are going to be put on your data, in the form of quality and security. I’m saying it’s better to deal with the database before the user interface. You may find yourself in a situation where you are forced to redo the user interface to meet your data requirements.

To see an example of a modern database using advanced RPGLE programming techniques, you can download a savf with a small 5-file application from this open-source page.

Kicking the can down the road

IBM i Database modernization

The legacy application dilemma.

The most common way to kick the can down the road is by modernizing the user interface without modernizing the database.

Another way to kick the can down the road is to modernize your database using surrogate logicals.

Why is the UI chosen to be modernized before the database?

The boss says, “I want a modernized application. Most of all, I want a new user interface.”

It’s easier to just address specific issues as opposed to taking a larger more strategic approach.

Failure to recognize the implications of doing the UI first or the benefits of the MVC architecture.

 Fear of risk

Strategic Argument

Doing the user interface first is counterproductive because you will be forced into converting the business logic from the legacy user interface to the modern browser language. That means new programming skills will be required and old programming skills will not. Secondly, it doesn’t get you any closer to a modern MVC (Model/View/Controller) architecture.

In contrast, by moving the business logic from the legacy UI to trigger programs and taking advantage of DB2 constraints you will be making good use of existing RPG skills, as well as making the task of modernizing the UI simpler.

Surrogate logicals do not give you the full functionality of SQL.

Risk can be managed by running parallel databases. Simply add an after trigger to the original file that executes the new trigger-program version of the same file in a test library. Most importantly, this allows for testing until you are satisfied.
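A sketch of that parallel-run trigger (library, table, and column names hypothetical): the after trigger on the production file replays each insert against the modernized copy in the test library, where the new validation triggers and constraints can accept or reject it.

```sql
-- Hypothetical parallel-run trigger: every row written to the production
-- file is also written to the modernized version in TESTLIB, exercising
-- the new triggers and constraints without risk to production.
CREATE TRIGGER MYLIB.CUSMAS_SHADOW_AI
  AFTER INSERT ON MYLIB.CUSTOMER_MASTER
  REFERENCING NEW AS N
  FOR EACH ROW
BEGIN
  INSERT INTO TESTLIB.CUSTOMER_MASTER
    (CUSTOMER_ID, CUSTOMER_NAME, CREDIT_LIMIT)
    VALUES (N.CUSTOMER_ID, N.CUSTOMER_NAME, N.CREDIT_LIMIT);
END;
```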

Download a sample of a modernized 5-file application.

Check out a special offer to have one of your files modernized.

Sample Modern Database

This example contains several test programs used to test the modernized database. The test programs can be found in the source file SRCIOS; they can be modified to add or update records. Remember that referential constraints are in place and validations are performed in the trigger programs. You need to be ILE literate to understand and work on these programs. Don’t be afraid to contact us for assistance.

Download Sample

DB2

DB2 is a relational database management system (RDBMS) developed by IBM. It was first introduced in the 1980s and has since gone through many iterations and updates.

DB2 is designed to manage large amounts of structured data and provides a wide range of tools and features to ensure data integrity, reliability, and security. It supports SQL (Structured Query Language), which is a standard language for accessing and manipulating data in a relational database.

Some of the key features of DB2 include:

  • Scalability: DB2 is designed to handle large amounts of data and can scale up or down depending on your needs.
  • Security: DB2 provides a range of security features, including encryption, access controls, and auditing, to ensure that your data is protected.
  • Availability: DB2 is designed to be highly available, with features like automatic failover and backup and recovery options.
  • Compatibility: DB2 is compatible with a wide range of operating systems and platforms, including Windows, Linux, and UNIX.
  • Performance: DB2 is optimized for performance, with features like data compression and indexing to ensure that queries and transactions run quickly and efficiently.

Overall, DB2 is a powerful and versatile RDBMS that is widely used in enterprise environments for managing large amounts of structured data.

Data Scientist

A data scientist is a professional who applies scientific methods, statistical algorithms, and machine learning techniques to extract insights and knowledge from structured and unstructured data. In simpler terms, data scientists analyze and interpret complex data sets to uncover patterns, trends, and insights that can be used to inform business decisions, product development, and other applications.

Data scientists are skilled in data manipulation and have a deep understanding of programming languages, statistical modeling, and data visualization tools. They work with large amounts of data from various sources, including databases, social media platforms, and sensor networks, to extract meaningful insights.

Data scientists are essential in today’s data-driven world, where businesses and organizations require accurate insights to make informed decisions. They can work in various industries, including finance, healthcare, e-commerce, and marketing, among others.

To become a data scientist, one typically needs a strong foundation in mathematics, statistics, and computer science, along with experience in programming languages such as Python or R. Additionally, data scientists must possess critical thinking skills, attention to detail, and the ability to communicate complex data analysis in a clear and concise manner.

In summary, data scientists are highly skilled professionals who play a crucial role in interpreting and analyzing complex data sets to inform business decisions and drive innovation.

Data Analyst

A data analyst is a professional who is responsible for collecting, processing, and performing statistical analyses on large sets of data. They use various techniques to identify patterns and trends in data, which can be used to inform business decisions and strategies.

Data analysts are proficient in programming languages like Python and R and have expertise in using data analysis tools such as SQL, Excel, and Tableau. They work with data from various sources such as customer transactions, web traffic, and social media interactions, among others.

Their main responsibilities include identifying data trends, generating reports, creating data visualizations, and developing data-driven solutions to business problems. They also collaborate with other professionals like data scientists, software engineers, and business analysts to improve data quality and streamline data processes.

Overall, data analysts play a crucial role in helping organizations make data-driven decisions and achieve their business objectives.