So, you have a list of users who each need several LBI bursting rights. Entering these manually is common, but it is prone to mistakes and time-consuming.  This is where loading a file into LBI is more efficient.


  1. First, let’s build a template file in Excel (this can be reused over and over again in the future).
    1. You’ll need to know how structures are set up in your LBI system, since this will vary.
  2. In this example, we have an ACCOUNTING UNIT structure that is equal to a numeric value; ours is 4 digits.
  3. Here is each column of the template file, in order (see the sample row after these steps):
    1. Column A: Action mode (A for Add, C for Change, D for Delete)
    2. Column B: Username
    3. Column C: Username
    4. Column D: Structure Name (ACCOUNTING UNIT in our example)
    5. Column E: Structure Sub-name (ACCOUNTING UNIT under Group1 in our example)
    6. Column F: Condition (equal to, greater than, etc.)
    7. Column G: Value assigned to the structure
    8. Column H: Used to create a second condition column after column G
    9. Column I: Element group (multiple groups create sub-groups within the same structure)
    10. Column J: Multiple element fields within a structure group
    11. Column K: Owner (typically Lawson or another admin user)
    12. Column L: Start Date (how far back the user can access older reports)
    13. Column M: End Date (how far into the future the user can access new reports)
  4. Once you have created your load file, go into LBI Reporting Services Administration and go to Import Rights.
  5. Browse and select your CSV template file.
  6. Click Validate Only to see any returned errors.
  7. If the contents are valid, add the file again and this time click the Validate and Load button.
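
Below is a hypothetical sample row for the template; the username, condition syntax, value, and dates are illustrative only, so match them to your own LBI configuration:

A,jdoe,jdoe,ACCOUNTING UNIT,ACCOUNTING UNIT,Equal To,4100,,1,1,lawson,01/01/2020,12/31/2049

Save the template as a CSV file before loading it.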

Go to Maintain Rights and validate that your users were loaded with the proper data. Good luck!


Many organizations opt to engage Lawson consultant teams for managing their Lawson Business Intelligence (LBI) system. These consultant teams offer managed services at a fixed monthly rate and possess extensive knowledge and expertise in managing LBI. This service is particularly suitable for larger organizations, but smaller organizations that do not require a full-time Lawson employee on-site may also find it beneficial. Nogalis provides this service, and you can contact us via our contact page for further information.

Here are the steps to follow if you receive the error message “Registration failed with exception” when trying to register a new federated system.

If you receive the message below when trying to register a federated system, open the lsservice.properties file on both servers.  Either add or update the line server.keystore.use.classic=false.
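
For example, the relevant line in lsservice.properties (typically found under LAWDIR/system; the path may vary by installation) looks like this:

server.keystore.use.classic=false

Make the same change on both servers, then retry the registration.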

Tue Feb 21 19:26:45.573 EST 2023 - default-1240412896 - L(2) : Registration failed with exception.  Details: registerServer() received Lawson Security Error: Please check log files for details

Error happened on server.com;40000;40001;LSS.

Unable to reach the specified server [server.com;9888;10888;LANDMARK]. It will not be registered.

Stack Trace:

com.lawson.security.interfaces.GeneralLawsonSecurityException: Unable to reach the specified server [server.com;9888;10888;LANDMARK]. It will not be registered.

at com.lawson.security.server.events.ServerServerFederationEvent.processRegisterServer(ServerServerFederationEvent.java:994)

at com.lawson.security.server.events.ServerServerFederationEvent.process(ServerServerFederationEvent.java:115)

at com.lawson.lawsec.server.SecurityEventHandler.processEvent(SecurityEventHandler.java:634)

at com.lawson.lawsec.server.SecurityEventHandler.run(SecurityEventHandler.java:377)

If the IBM HTTP Server logs for your Web Server become too large to open and take up too much disk space, configure the Web Server to roll the logs by day and by size.


Steps to perform:

IBM HTTP Server writes many logs to the folder “<Installation_Directory>/IBM/HTTPServer/logs”.  You can customize these log files, such as the following logs in IBM HTTP Server:

  • Admin Log: admin_access.log
  • Admin Error Log: admin_error.log
  • Access Log: access_log
  • Error Log: error_log


  1. Go to the location of your IBM HTTP Server installation ($IHS_HOME or <Installation_DIR>/IBMHTTPServer).
  2. Change to the “conf” directory and open the httpd.conf file.
  3. Locate the line: CustomLog logs/access_log common
  4. Comment out that line, and after it add the rotation line, as shown below.

Change:

CustomLog logs/access_log common

To:

#CustomLog logs/access_log common

CustomLog "|/opt/IBM/HTTPServer/bin/rotatelogs -l /opt/IBM/HTTPServer/logs/access_log.%Y.%m.%d 5M" common
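
For reference, in the piped-log line above, -l tells rotatelogs to use local time rather than UTC, the %Y.%m.%d pattern is a strftime-style template that stamps each file with the date on which it was opened, and 5M rotates the log once it reaches 5 MB. The rotated files are named like:

/opt/IBM/HTTPServer/logs/access_log.2023.02.21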


  5. Locate the line: ErrorLog logs/error_log
  6. Comment out that line, and after it add the rotation line, as shown below.

Change:

ErrorLog logs/error_log

To:

#ErrorLog logs/error_log

ErrorLog "|/opt/IBM/HTTPServer/bin/rotatelogs -l /opt/IBM/HTTPServer/logs/error_log.%Y.%m.%d 5M"


  7. Then restart IBM HTTP Server.


Review the logs in the “<Installation_Directory>/IBM/HTTPServer/logs” directory to confirm that the access log is now rotating by the current date.


At times, you may get the following errors when rebuilding the search index for Infor Security Services (ISS):

JVMDUMP039I Processing dump event "systhrow", detail "java/lang/OutOfMemoryError" at 2023/03/25 19:35:53 - please wait.


To resolve these errors, do the following:

Adjust the JVM max memory size for ssoconfig in GENDIR\java\command\ssoconfig.properties to 4096m:

Before: ljx.vm.options=-Xmx512m

After: ljx.vm.options=-Xmx4096m

Next, save and close the file.

This change is dynamic and does not require a restart, but you must exit ssoconfig for it to take effect.

After the change is complete and you’ve exited the ssoconfig menu, go back into ssoconfig -c and choose the option to rebuild the search index.

This article is a step-by-step guide to resolving the following Lawson error:

“loadResource(): returned identity [User:] already exists”

We need to run a list-based sync, but first let’s explain what it is.


How list-based sync works:
First, an XML file containing all the records to be synchronized is required. The ssoconfig utility is then used to import the file. The sync command transfers all relevant data for actors/users, identities, and their associated roles (and services, if required) from the “original system” (the one with updated data) to the “target system” (the one in need of an update).


Note: When synchronizing, if a service is present on the original system but not on the target system, it will be created on the target system. Keep in mind that syncing of services is a one-time process.

By default, during synchronization, Actor Roles are merged along with other user information. For instance, if an Actor record in LSF has one role, “NormalUser,” and the same Actor is present in Landmark with two roles, “NormalUser” and “BenefitsClerk,” then after synchronization both systems would show “NormalUser” and “BenefitsClerk” for that actor. However, if you prefer to prevent the merging of Actor Role attributes, you can override the default setting in the input file. In this case, you will be asked to choose between an Update Local or Update Remote action for each difference, just as for other attributes.


Create the XML input file (default behavior for Actor Roles)

Prepare an XML file in a suitable format for importing. The following example shows the standard approach, in which Actor Roles are merged during synchronization.


Example File:
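
As a rough, hypothetical sketch only (the element and attribute names below are illustrative, not the documented Lawson schema), a default-behavior input file might look something like this:

<syncList>
  <user id="jsmith"/>
  <user id="mjones"/>
</syncList>

Consult the Infor security administration documentation for the exact format used by your version.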

XML input file in which you specify behavior of Actor Roles

This example contains the “mergeActorRoles” attribute, set to “false.” Consequently, the roles will not be merged, and the action specified in the XML file will be followed instead.

NOTE:  If you require the merging of roles for some Actor records, you should adjust the “mergeActorRoles” attribute to “true.”


Example file:
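
Again, as a hypothetical sketch (only the mergeActorRoles attribute comes from this article; the other names are illustrative), a file that overrides role merging might look something like this:

<syncList>
  <user id="jsmith" mergeActorRoles="false" action="UpdateLocal"/>
</syncList>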

Run the import command using the ssoconfig utility

  1. Execute the import command using the ssoconfig utility. Open a command window and enter the following command:

ssoconfig -S PWD <full_path_and>InputFileName


In this command, replace PWD with the security administrative password and <full_path_and>InputFileName with the name of the input file that you created in the previous step.
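
For example, with hypothetical values filled in:

ssoconfig -S mySecurityPwd D:\lsf\sync\syncInput.xml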

Once entered, a notification will confirm that the synchronization process has begun.

  2. To monitor the synchronization progress, navigate to the Check Background Sync Status option within the Manage Federation sub-menu.

Validate results.

Introduction:
In today’s fast-paced business environment, companies often undergo digital transformations, migrating from legacy Enterprise Resource Planning (ERP) systems to modern solutions. However, legacy ERP applications often hold valuable historical data that must be preserved for regulatory compliance, historical analysis, and potential future use. Application archiving projects often fall under the responsibilities of IT Application managers at most companies. In this article, we will explore how to effectively archive data for long-term storage, the methods for storing archived data, and data retention strategies that align with the unique needs of large-scale enterprises.


How to Archive Data for Long-Term Storage:
Archiving data for long-term storage involves preserving valuable historical records while ensuring data integrity, security, and accessibility. IT Application Managers should consider the following best practices:
a. Data Segmentation: Identify the data that needs to be archived, separating it from the active dataset. ERP applications like Infor Lawson, Oracle, SAP, or PeopleSoft may contain vast amounts of historical data, and careful segmentation ensures the right data is archived.

b. Data Validation: Before archiving, ensure data accuracy and completeness. Running data validation checks will help identify and rectify any inconsistencies.

c. Compression and Encryption: Compressing and encrypting the archived data optimizes storage space and enhances security, protecting sensitive information from unauthorized access.

d. User Access: Perhaps the most critical component of any viable data archive solution is how accessible it is to users. The right solution should enable users to access archived records without the need to involve IT.

e. Role-Based Security: Of course, with ease of access comes responsibility. The right solution needs to enforce well-defined security roles to ensure that users access only the data they are authorized to see and nothing more.

Methods for Storing Archived Data:
IT Application Managers have various options for storing archived ERP data, including:
a. On-Premises Storage: Keeping archived data on local servers allows for complete control over the storage environment, ensuring immediate access to data when needed. Of course, on-premises storage can be costly due to infrastructure investments and ongoing maintenance. Additionally, it may face limitations in disaster recovery options and data accessibility for remote employees, potentially hindering seamless operations in geographically dispersed organizations.

b. Cloud-Based Storage: Utilizing cloud computing platforms for data archiving offers scalability, cost-effectiveness, and high availability. Cloud providers like AWS, Azure, or Google Cloud offer secure and reliable solutions for long-term data retention. However, storing structured data in the cloud without a structured presentation layer falls short of meeting all of the requirements stated above.

c. Cloud-Based Application: Combining the benefits of cloud computing with a well-thought-out presentation layer is the ultimate way to address the challenges of ERP data retention. This option provides the freedom to decommission on-premises servers while maintaining data integrity and giving users an easy way to continue accessing the data in the cloud.

Data Retention Strategies:
Data retention strategies aim to define the retention period for archived data, ensuring compliance with industry regulations and business needs. IT Application Managers should consider the following approaches:
a. Legal and Regulatory Requirements: Compliance with industry-specific regulations, such as HIPAA, GDPR, or SOX, requires setting appropriate data retention periods to avoid penalties and legal issues.

b. Business Needs: Align data retention policies with the company’s specific business requirements and operational workflows.

c. User Stories: Understanding the needs of your subject matter experts is key to archival success. The SMEs understand the real data needs of the business, audit requirements, and what information needs to be readily accessible.


Conclusion:
Archiving old ERP applications is a crucial responsibility for IT Application Managers in large-scale enterprises with extensive data centers and cloud computing operations. By understanding how to archive data effectively, selecting a suitable solution, and implementing data retention strategies, organizations can preserve valuable historical information securely and compliantly. As companies continue to evolve and modernize, the archiving process becomes a strategic investment in the long-term success of their operations. For IT Application Managers seeking a comprehensive and reliable archiving solution, APIX offers a user-friendly, cost-effective, and secure data archiving platform tailored to the unique needs of large enterprises. To learn more, visit https://www.nogalis.com/lawson-data-archive/

This article covers the steps for configuring Lawson to automatically generate additional output file types (XML, CSV, etc.) for each batch job that is run.


In LAWDIR/system, look for an rpt.cfg file.  If the file doesn’t exist, you must create it.

Next, to generate XML for all batch jobs, you must set RUNTIME_XML to ON.  For CSV, the configuration type is RUNTIME_CSV.

See below for a sample file:

lawdir/system/rpt.cfg
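
A minimal sketch of what that file might contain, based on the settings described above (the exact syntax may vary by version):

RUNTIME_XML ON
RUNTIME_CSV ON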

There is no restart or downtime required for these changes to take effect; the new settings will simply be applied to your batch jobs moving forward.