Effective December 31, 2024, Infor will transition all versions of the products listed below to the next stage of support, in accordance with the Product Lifecycle Policy for Infor OS On-Premises (Infor OS OP) products.

The affected products moving from Supplementary Update Maintenance to Sustaining Maintenance are:

  1. Infor OS
  2. Infor OS Lite

Reasons for the Change

This transition is prompted by the retirement of several underlying prerequisite components that Infor OS OP relies on, including Amazon Corretto® 8 (Java®), as well as versions of Microsoft Windows®, Microsoft SQL Server, Microsoft .NET, and other prerequisite technologies.

Impact on Users

Supplementary Update Maintenance for these products will continue until December 31, 2024. After this date, they will enter Sustaining Maintenance. During this phase, users will retain access to existing patches, documentation, and Knowledge Base articles. However, no new updates or fixes will be provided for these products.

It’s crucial to note that the third-party products required by Infor OS OP are also approaching the end of their support cycles. Infor strongly advises weighing continued use of these products carefully, as running Infor OS OP beyond 2025 may expose your environment to security risks.

Available Alternatives

Infor is offering two alternatives for customers with any version of the affected Infor OS OP products:

  1. For customers active on support/maintenance: A no-cost license exchange to a new on-premises (OP) product called Local Technology Runtime (LTR).
  2. For all customers: Purchase access rights to a new multi-tenant (MT) Infor OS tier, which includes a coterminous OP license to the LTR product available in option 1.

Introduction to Infor LTR

Infor LTR will replicate the critical core services of the current Infor OS OP product, including Security, Ming.le, ION, Infor Document Management (IDM), and API Gateway. This will be achieved using new technical prerequisites with support committed until the end of the decade. The hybrid LTR option allows connection to MT Infor OS, extending the multi-tenant world to on-premises customers.

Installing LTR involves a straightforward technical upgrade, akin to applying the latest version of Infor OS OP. This means no major rip-and-replace—just a simple, in-place upgrade. The transition from old Infor OS OP to new LTR OP will be seamless for end-users and administrators, requiring no retraining.

While LTR OP will be updated to maintain compatibility and resolve defects, new innovations and updates will only occur in Infor OS MT.

Infor’s Commitment to Business Requirements

For customers seeking to exchange their current Infor OS OP for a new OP-only solution, Infor LTR provides the same core functionality with planned support until the end of the decade.

For those aiming to leverage the latest technologies for digital transformation alongside existing on-premises applications—a hybrid solution:

  1. Infor OS MT offers a more extensive and feature-rich set of services and capabilities.
  2. Infor LTR OP is included, opening the door to unique hybrid deployment options.
  3. Combining MT and LTR unlocks new functionality and possibilities, growing over time with Infor OS MT releases.
  4. The Infor OS MT license option serves as a launchpad for digital transformation services such as Augmented Intelligence with Machine Learning (AI/ML), Enterprise Automation with Robotic Process Automation (RPA), and other add-on technology services exclusive to the MT world.

These options grant flexibility to run core technology services on-premises, in the multi-tenant cloud, or a mix of both, according to your requirements.

Next Steps

Infor strongly recommends initiating the transition planning process by reaching out to your Infor representative or submitting a support ticket. Regardless of the chosen path, Infor is committed to assisting you in developing the most effective strategy, ensuring you have the necessary products, capabilities, capacity, and licenses to meet your requirements.

 

These days, many enterprise resource planning (ERP) platforms have found ways to incorporate artificial intelligence (AI) into their business systems. AI can assist in analyzing patterns and forecasting both customer demand and optimal business efficiency with great accuracy. Scott Hebert, Forbes Council member and Americas Chief Revenue Officer for SYSPRO, shares an article on what AI really means for ERP and the effects of generative AI being thrown into the mix. He states,

For ERP, generative AI can enable end users to communicate with the platform using ordinary language, either typed or spoken, and the software will reply in a similar fashion. Let’s say you want to get a report on which key raw materials the company is in danger of running short in the next three months. Simply ask, “Produce a report on the key raw materials of which we are at the highest risk of running short in the next three months.” You could follow up by asking the platform to email you a new report on the first of each month. It can be that simple. In the future, I see decision-makers being able to directly ask ERP for recommendations on what actions to take. If the generative AI-powered ERP says that the company will likely run out of component X in two months because their key supplier is experiencing difficulties due to labor shortages, the user could ask, “What actions do you recommend the company take to prevent this shortage?” In this way, the ERP becomes a key advisor to help solve business challenges.

Hebert also notes that generative AI can organize, summarize, and index enormous volumes of unstructured data, cutting the time and cost it would usually take a team to do the same task. He states that for ERP users, this capability could come in handy when attempting to understand long, complex purchase orders for, say, a shipyard that needs highly specialized fabricated metal for a new boat.

 

For Full Article, Click Here

Infor recently unveiled its first major update for 2024, and Steve Brooks, joint editor at Enterprise Times and senior analyst at Synonym Advisory, shares an article highlighting the latest from the tech giant.

Infor Gen AI. “Another big announcement with this release is the emergence of generative AI capabilities. Eamonn Ida, Head of Solution Marketing, Infor OS Platform, said, ‘We’re very excited to announce Infor Gen AI, all new generative AI capabilities that are being developed across Infor CloudSuites that will further enhance our industry AI solutions.’ These new developments are underpinned by three tenets: A hyperfocus on industry value creation, The democratization of AI benefits throughout organizations, and Maintaining data privacy and security.”

Infor ESG Reporting. “ESG reporting will initially be available for customers in process, distribution, fashion, and discrete manufacturing. The new reporting solution includes dashboards and built-in templates to support common ESG reporting frameworks such as the Global Reporting Initiative (GRI) and European Sustainability Reporting Standards (ESRS). The templates are automatically populated using data held within Infor ERP, spreadsheets, and other applications. The solution will enable organizations to deliver improved accuracy, transparency, and consistency in their ESG reporting. It simplifies data collection and enables users to identify the root causes of omissions or anomalies.”

Infor Value+. “Infor Value+ accelerators are broken down into three categories: 1) Value+ Automation Flow: These aim to automate common processes across organizations that are current pain points. 2) Value+ Insights: AI analyses data to surface insights and actions that will help organizations highlight revenue opportunities and improve business KPI. 3) Value+ Advanced Workspaces: These surface business intelligence and key information that people need to do their jobs within their function and industry.”

Additional industry updates were announced among Infor’s current platforms, including Infor M3, Infor LN, Infor SyteLine, Infor Financials, Infor HCM, Infor Workforce Management, and Infor MES.

 

For Full Article, Click Here

In 2024, the release schedule for Infor Lawson V10 products deployed on-premises and on single-tenant cloud platforms will undergo a change. Instead of the March and September releases observed in 2023, these products will now follow a schedule with releases in April and October.

 

Additionally, Lawson Enterprise Applications Maintenance Service Packages (MSP) will be provided annually, synchronizing with the October release.

 

Critical bug fixes for products will still be promptly delivered as needed, encompassing security patches and updates addressing date-sensitive statutory and regulatory compliance requirements.

 

If you have any questions about the new update schedule, reach out to your managed service provider, or your Infor Account Team.

 

Great news!  Nogalis provides Managed Services for all your Infor Lawson products.  Find out more here!

 

Artificial intelligence (AI) and machine learning (ML) have changed the automation landscape and the fundamentals of IT. Antony Adshead, storage editor at Computer Weekly, shares a post detailing storage technology and AI’s effects on data storage. Storage is a key part of AI, Adshead notes: it supplies data for training, stores the potentially huge volumes of data generated, and serves data during inference, when the results of AI are applied to real-world workloads.

What are the key features of AI workloads? “There are three key phases and deployment types to AI workloads:

  1. Training, where recognition is worked into the algorithm from the AI model dataset, with varying degrees of human supervision;
  2. Inference, during which the patterns identified in the training phase are put to work, either in standalone AI deployments and/or;
  3. Deployment of AI to an application or sets of applications.”

What are the I/O characteristics of AI workloads? “Training and inferencing in AI workloads usually requires massively parallel processing, using graphics processing units (GPUs) or similar hardware that offload processing from central processing units (CPUs). Processing performance needs to be exceptional to handle AI training and inference in a reasonable timeframe and with as many iterations as possible to maximize quality. Infrastructure also potentially needs to be able to scale massively to handle very large training datasets and outputs from training and inference. It also requires speed of I/O between storage and processing, and potentially also to be able to manage portability of data between locations to enable the most efficient processing.”

What kind of storage do AI workloads need? “The task of storage is to supply those GPUs as quickly as possible to ensure these very costly hardware items are used optimally. More often than not, that means flash storage for low latency in I/O. Capacity required will vary according to the scale of workloads and the likely scale of the results of AI processing, but hundreds of terabytes, even petabytes, is likely.

Storage for AI projects will range from that which provides very high performance during training and inference to various forms of longer-term retention because it won’t always be clear at the outset of an AI project what data will be useful.”

Is cloud storage good for AI workloads? “Cloud storage could be a viable consideration for AI workload data. The advantage of holding data in the cloud brings an element of portability, with data able to be “moved” nearer to its processing location. Many AI projects start in the cloud because you can use the GPUs for the time you need them. The cloud is not cheap, but to deploy hardware on-premise, you need to have committed to a production project before it is justified.”

 

For Full Article, Click Here

If you have services for approval or other types of flows in IPA, and they stop getting triggered after a Landmark update, it is possible that you need to repackage and redeploy the LPS jars.  This is good practice after a Landmark CU anyway.

  1. On the Landmark server, open a Landmark command window
  2. Run the command “packageLPSClientJars”
  3. Copy the LASYSDIR/LPS/LPSClientJars.jar file to the Lawson server at GENDIR/bpm/jar
  4. Run commands stoppfem and stoppfrmi
  5. Navigate to GENDIR/bpm/jar in a command window with environment variables set
  6. Run tar -xvf LPSClientJars.jar
  7. Run commands startpfrmi and startpfem
  8. If the changes don’t take effect, reboot the Lawson server
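As a convenience, steps 1 through 7 above can be sketched as a single shell script, assuming a Unix-style environment. The LASYSDIR and GENDIR paths, the lawson-host name, and the use of scp for the copy step are all placeholder assumptions for your environment. The script is dry-run by default (each command is only printed), so you can review it before executing; set RUN to empty to run the commands for real.

```shell
#!/bin/sh
# Sketch of the LPS jar repackage/redeploy steps. Dry-run by default:
# every command is printed via echo, not executed. Set RUN= to execute.
RUN=${RUN:-echo}

# Placeholder paths; substitute your environment's actual values.
LASYSDIR=${LASYSDIR:-/landmark/lasysdir}
GENDIR=${GENDIR:-/lawson/gendir}

# Steps 1-2: repackage the LPS client jars (Landmark command window)
$RUN packageLPSClientJars

# Step 3: copy the jar to the Lawson server (scp shown as one option)
$RUN scp "$LASYSDIR/LPS/LPSClientJars.jar" "lawson-host:$GENDIR/bpm/jar/"

# Step 4: stop the Process Flow engine and RMI service
$RUN stoppfem
$RUN stoppfrmi

# Steps 5-6: extract the jar in GENDIR/bpm/jar (run on the Lawson server
# in a command window with environment variables set)
$RUN cd "$GENDIR/bpm/jar"
$RUN tar -xvf LPSClientJars.jar

# Step 7: restart the RMI service, then the engine
$RUN startpfrmi
$RUN startpfem
```

Step 8 (rebooting the Lawson server if the changes don’t take effect) is intentionally left manual.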

 

Compared to on-premise environments, the benefits of being on the cloud are exponentially greater. Paul Wagenseil, custom content strategist for CyberRisk Alliance, shares an informative article on their website on the impacts of cloud migration and the modern network security platform. In April 2024, the CyberRisk Alliance (CRA) Business Intelligence surveyed 202 security and IT managers, executives and practitioners and found that in the previous 12 months, 93% of respondents migrated some share of their workloads to the cloud. Forty-two percent said more than half their workloads were cloud-based, while 16% said more than three-quarters were. Likewise, in Check Point’s 2022 Cloud Security Report, 98% of respondents said their organizations used “some form of cloud-based infrastructure,” and 76% used more than one cloud service provider (CSP), including private cloud deployments. One survey respondent remarked, “The cloud gives us scalability. If we need a new server, we can spin that up in minutes rather than waiting on equipment purchase for on-prem. It lets us focus more on application support … rather than focusing on worrying about infrastructure.”

However, with any new technology comes risk. Securing a cloud-based or hybrid network with both cloud and on-premise elements is different from securing a fully on-premise network. Wagenseil notes that assets, application servers, and databases are often scattered among different cloud instances, or between cloud and on-prem servers, sometimes even with the same asset sharing space in multiple environments. “Network-security practitioners can no longer draw a ring around a core group of assets and declare that they are protected,” he says. “Instead, security tools and personnel have to follow each asset, each set of data and each user and create protections around them individually. This can lead to a radically different concept of network topology and security and requires drastic retraining of security personnel.” The biggest risk is often human error and misunderstanding of how the cloud network functions. “Misconfigurations are among the top risks facing cloud users,” says Wagenseil. “Check Point’s 2022 Cloud Security Report found that, for 33% of organizations, the complexity of their cloud environments makes it challenging to rapidly identify and correct misconfigurations before they can be exploited by an attacker. Likewise, CRA’s 2024 report put misconfiguration vulnerabilities at the top of the list of common cloud security-related incidents, with 35% of respondents citing such an incident in the past year.” Organizations should dedicate their efforts to fully learning their new cloud environments and seek out resources such as written guides or IT partners. That way, they can completely grasp their cloud networks without worrying about compromising their network security.

 

For Full Article, Click Here

Infor recently announced the launch of Infor GenAI and ESG Reporting to help customers improve productivity and report on their environmental footprint. Per the press release, Infor’s modern solutions, from industry-leading ERPs to supply chain solutions, run critical operations in manufacturing, distribution, healthcare and public sector, and the power of GenAI combined with unique industry capabilities and insights gives customers immediate power to leverage the right data and workflows to help realize more value more quickly. Additionally, the solutions adhere to stringent security and data privacy best practices driven by Infor’s OS platform. Soma Somasundaram, Infor’s President and Chief Technology Officer, comments, “We’re constantly listening to customers’ pain points and anticipating opportunities that allow customers to be successful and competitively advantaged. When it comes to building solutions that are hyper-productive, our vision is to minimize the time customers spend in applications so that they can maximize their time focusing on their specialized work and creating value. Achieving this requires more than just humans and artificial intelligence working together. It also requires the right understanding of industry subsegments and individual workflows. As a company built around industry specificity, Infor understands the context of how people work so that we can build those best practices into our products to help enhance productivity. Infor GenAI is designed with this in mind.” Further, Infor ESG Reporting helps customers address the increased pressure and changing requirements from internal and external stakeholders to report on their environmental footprint. With Infor ESG Reporting, customers have the data and analytics to help drive more meaningful, measurable, and actionable change, while reporting on their environmental footprint with improved accuracy, transparency, and consistency.

 

For Full Article, Click Here

Are you getting the following error when updating a user’s record in ISS?

loadResource(): returned identity [User:] already exists

 

If so, this error is due to missing extended attributes in LDAP. The extended attributes are what allow ISS to know that the user exists in Landmark (LMK), and they are created when a user is added via ISS or included in a sync.

Simply run a full sync via the ISS webpage to create the extended attributes.

 

To prevent this issue going forward, add a list-based sync to the user add process. This should resolve the error completely.
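If you want to confirm whether a user’s entry actually lacks attributes before running the sync, one way is to dump that user’s entry from the LDAP directory ISS binds to. This is an illustrative sketch, not Infor documentation: the LDAP URL, bind DN, search base, and user cn below are all placeholder assumptions, and the command is prefixed with echo so it only prints by default.

```shell
#!/bin/sh
# Illustrative sketch: inspect a user's entry in the LDAP directory that
# ISS uses, to see which attributes exist for that user.
# All values below are placeholders; substitute your environment's values.
LDAP_URL=${LDAP_URL:-ldap://ldap-host:389}
BIND_DN=${BIND_DN:-cn=lawson,dc=example,dc=com}
BASE_DN=${BASE_DN:-ou=people,dc=example,dc=com}
USER_CN=${USER_CN:-jsmith}

# Prefixed with echo (dry run). Remove "echo" to query your directory,
# adding -W so ldapsearch prompts for the bind password.
echo ldapsearch -x -H "$LDAP_URL" -D "$BIND_DN" -b "$BASE_DN" "(cn=$USER_CN)"
```

Comparing the attributes on a failing user against those on a user created through ISS can show whether the extended attributes are the ones missing.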

 

Tech expert Brian Sommer shares an article on Diginomica highlighting Infor’s product update presentation earlier this March. The nearly two-day event at Infor’s New York HQ covered many of the innovations that the enterprise resource planning (ERP) vendor has been working on in recent years. Sommer provides his key takeaways in the full article:

 

For Full Article, Click Here