Problem:           

A patch being installed is failing when running lawappinstall activate. It fails when running ujobload.

10/31/2023 5:44:25 Executing ujobdump.

10/31/2023 5:44:25 ujobdump execution successful.

10/31/2023 5:44:25 Executing ujobload.

10/31/2023 5:44:26 ERROR – ujobload failed.

ujobload run via lawappinstall activate reports "*** No jobs found to load". When run manually, ujobload fails with Segmentation Fault (coredump).

 

Resolution:     

lawappinstall update stages tokens that may need a ujobdump/ujobload in $LAWDIR/productline/backup/ACTIVATEstage/JOBconversion.

If conditions are correct, lawappinstall activate will run ujobdump and ujobload, then clean up the staged area.

  a. ujobdump -d $LAWDIR/productline/backup/ACTIVATEstage/JOBconversion productline $LAWDIR/productline/backup/ACTIVATEstage/JOBconversion.dmp -t <list of Tokens>

In the above <list of Tokens> would be a space-separated list of the tokens located in $LAWDIR/productline/backup/ACTIVATEstage/JOBconversion/??src directories.

  b. ujobload -ou productline $LAWDIR/productline/backup/ACTIVATEstage/JOBconversion.dmp
  c. Remove the dump file, $LAWDIR/productline/backup/ACTIVATEstage/JOBconversion.dmp
  d. Remove the stage directory, $LAWDIR/productline/backup/ACTIVATEstage/JOBconversion

Run steps a through d manually, then rerun lawappinstall activate. If the ujobload fails in step b with a Segmentation Fault (coredump) or other error, make sure the user running ujobload has write access to these files and their directories.

$LAWDIR/UJobLoadDir/productline/Tokens

$LAWDIR/productline/UJobLoadDir/LDLog

Make corrections if necessary, run steps b through d above, and rerun lawappinstall activate.
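For reference, below is a minimal shell sketch of steps a through d. The productline name (LSAPPS), the token list (PO20 AP110), and the helper variables are illustrative assumptions only; substitute your own productline and the tokens found in your ??src directories.

# Hypothetical values: productline LSAPPS, staged tokens PO20 and AP110.
STAGE=$LAWDIR/LSAPPS/backup/ACTIVATEstage/JOBconversion
DUMP=$LAWDIR/LSAPPS/backup/ACTIVATEstage/JOBconversion.dmp

# If ujobload previously failed with a coredump, confirm write access to these first.
ls -ld $LAWDIR/UJobLoadDir/LSAPPS/Tokens $LAWDIR/LSAPPS/UJobLoadDir/LDLog

# a. Dump the staged job definitions for the listed tokens.
ujobdump -d $STAGE LSAPPS $DUMP -t PO20 AP110

# b. Load the dump into the productline.
ujobload -ou LSAPPS $DUMP

# c. Remove the dump file.
rm $DUMP

# d. Remove the staged directory.
rm -r $STAGE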

Many businesses worldwide are beginning to allow more artificial intelligence (AI) in their business systems. In the Philippines, experimenting with AI is taking time, but it promises growth and increased productivity for businesses there. Technology content writer Dawn Solano shares an article on PhilSTAR Tech covering an interview with Infor CEO Kevin Samuelson and his thoughts on AI’s impact on the Philippines’ business landscape. “AI, being the ‘next frontier for technology’ will encourage creativity that Filipino businesses and businesses in most countries should explore,” he said. In addition, he told PhilSTAR Tech, “Those that are experimenting and learning will be those that end up ultimately benefiting the most.” Infor has used AI in its cloud software services for years, which has benefited both the company and its clients. That knowledge of AI in the tech industry has allowed the tech giant to apply more advanced methods for its clients, Solano notes from the interview. Additionally, when asked how legacy businesses in the Philippines that are deep in technology debt should approach AI, Samuelson acknowledged that it could be overwhelming, yet it does not necessarily require deep technical expertise. Moreover, he states that there are many third-party agencies that could help bring companies ‘up to speed.’ “These hyperscalers have built this amazing amount of compute, put the algorithms in place,” Samuelson notes. “And so I think the evolution of the technology has also made it far easier to adopt.”

 

For Full Article, Click Here

One of the latest competitive advantages in today’s digital business landscape is the efficient use of artificial intelligence (AI) and cloud computing. When done right, AI and cloud computing help drive business growth, productivity and security without driving up costs or resource use. From innovative AI management tools to cutting-edge cloud optimization strategies, Forbes Technology Council members share some best practices below to help your business maximize performance and minimize costs.

  1. Remove Silos To Prepare For A Quantum Future – “A quantum future is approaching, and leaders need to prepare now. It has the potential to drive massive productivity on its own, but when paired with AI, it can create a powerhouse of efficiency and security. Leaders need to start removing silos now to inform all teams about how strategic adoption can drive long-term success. – William Briggs, Deloitte Consulting”
  2. Utilize AIOps To Predict Potential Issues –  “One holistic approach to enhance AI and cloud computing efficiency is automation through AIOps. Automation can predict potential issues before they impact business operations through data analysis, streamlined workflows and reduced risk of human error. As a monitoring and predictive capability, automation prevents downtime, optimizes resource allocation and enhances security and compliance. – Akhilesh Tripathi, Digitate”
  3. Optimize Efficiency And Minimize Resource Duplication With Containerization – “Containerization is one process that optimizes resource efficiency by isolating applications while sharing the same operating system. This process also minimizes the resource duplication and reduces the cloud costs. By using this, companies can test their AI models without wasting their valuable resources. – Asad Khan, LambdaTest Inc.”
  4. Reduce Waste And Costs With Serverless Computing Architectures – “One effective practice is adopting serverless computing architectures. This approach allows companies to run applications and services without managing servers, which optimizes AI workloads by dynamically allocating resources based on demand, thereby reducing waste and costs. – Savitri Sagar, Kenzo Infotech”
  5. Emphasize The Importance Of Data Quality – “One best practice for enhancing AI is implementing data-driven reports with full traceability, educating all employees on the importance of data quality and fostering a data-first culture over time. This approach ensures data sufficiency for effective AI usage, transforming organizational decision-making and enhancing strategic alignment and efficiency. – Ged Ossman, Interf”
  6. Look Toward The Future Of Space-Based Technologies – “In the future, space-based data centers will be used for cloud computing for terrestrial applications. These will be more efficient than earth-based data centers as they will be able to use passive radiative cooling and efficient laser-based networking, and can theoretically co-locate far more computing power than the terrestrial electrical grid allows. – Ezra Feilden, Lumen Orbit, Inc.”
  7. Explore Neuromorphic Computing For Complex AI Tasks – “Neuromorphic computing is an emerging technology that mimics the human brain’s structure and function, allowing for more efficient AI processing with lower energy consumption. Neuromorphic chips can help companies perform complex AI tasks at the edge, speeding up data processing and reducing cloud computing demands. – Sriram Panyam, DagKnows”
  8. Build In Escape Hatches For Common Workflows – “Companies should consider building in “escape hatches” for common workflows. For example, if you deploy an internal AI chatbot for HR questions, you could anticipate the most common requests (e.g., request a pay stub or tax docs), and rather than have an AI decide what to do, direct the user to the process you have for pay stubs and tax docs. – James Ding, DraftWise”
  9. Leverage Unified Solutions Over Point Solution-Based AI – “Like past technological waves, AI is setting up to be a race to the middle. That has played out with the cloud, and now we get less from each cloud investment, as people are overwhelmed by how much we’ve thrown at them. Leveraging unified versus point solution-based AI that sits on top of our cloud software can improve the efficiency of our cloud investments and ensure we get durable differentiation from AI. – Michael Haske, Krista.ai”
  10. Develop A Well-Defined Data Catalog – “A well-defined data catalog with rich metadata and semantics is crucial for spearheading GenAI-based solutions in companies. A data catalog enables more sophisticated data analytics and AI capabilities. It helps in automating the data preparation process and supports more advanced techniques like semantic search, which can be leveraged to build more powerful and effective GenAI solutions. – Faisal Fareed, Amazon Web Services”
  11. Use Smaller Language Models For A Lower Carbon Footprint – “Smaller language models are more efficient, secure and able to scale economically with a lower carbon footprint. These closed models can be pre-trained on specific domains with AI workflows for the most optimized business processes while bringing a curated collection of information specific to your enterprise. – Venky Yerrapotu, 4CRisk.ai”
  12. Segment AI Applications And Workloads Based On Cost – “AI is a diverse technology with many applications that will entail many different deployment options that will be constantly changing as edge AI and cloud AI continue to evolve. Tech leaders should segment AI applications and workloads based on the cost (which includes IT or CSP’s energy cost) and determine where they are best run on-device, across edge infrastructure and cloud. – Leonard Lee, neXt Curve”
  13. Consider Distributed Computing – “Distributed computing is certainly one of the key emerging technologies companies should consider to improve efficiency in their AI and cloud computing resource consumption. AI, especially generative AI and advanced AI research, can be incredibly computing-intensive. By applying advanced distributed computing techniques, businesses can orchestrate AI tasks to appropriate resources and reduce costs. – Humberto Farias, Concepta”
  14. Look Into Federated Learning To Reduce Centralized Storage Needs – “One emerging technology to improve AI and cloud computing efficiency is federated learning. It enables models to be trained across multiple decentralized devices or servers, reducing the need for centralized data storage and massive computational power. This approach not only enhances data security but also optimizes resource usage by leveraging the computing power of edge devices. – Jiahao Sun, FLock.io”
  15. Consider LLMOps To Optimize LLM Use – “Large language model operations (LLMOps) is an emerging category that can help provide for more efficient and optimized use of LLMs. Running these models can be expensive, which can make it challenging to achieve sufficient ROI as well as scale. LLMOps are essentially a framework that focuses on efficient resource allocation, tracking models, evaluating responses and improving inference. – Muddu Sudhakar, Aisera”
  16. Educate Employees On The Nuances Of Proper Prompts – “Education on prompt engineering best practices is the most important way to improve companies’ efficient use of AI. AI is rapidly evolving, but in today’s environment, chatbot interfaces and the prompts that are used to get valuable output from today’s LLMs are critical to deriving value. Most employees treat them like search engines, do not understand the nuances of prompts and need training. – Michael Keithley, United Talent Agency”
  17. Focus On The Fundamentals – “Zeroing in on the fundamentals of your business can be a powerful way to optimize resource consumption. By revisiting basic principles like clearly defining project goals, streamlining data pipelines and right-sizing cloud resources, you can ensure your AI and cloud investments are truly delivering value, facilitating efficiency gains and allowing teams to do more with less. – Todd Fisher, CallTrackingMetrics”
  18. Leverage Computing Power From Endpoint Devices – “Not everything needs to be done in the cloud. There are a lot of computing resources sitting idle at every organization, every single day. Leveraging the computing power from endpoint devices brings back distributed computing at essentially no cost. – Elise Carmichael, Lakeside Software”

 

For Full Article, Click Here

Problem:           

Sometimes when running a CTP patch install preview (GENDIR/bin/lawappinstall preview <productline>), the program executes lasetup with the preview option and displays the following error:

ERROR – failed to uncompress “patch.tar.Z” file.

Installation YEAREND126174.preview of YEAREND126174 terminated abnormally (start = 12/20/2023 13:27:01, stop = 12/20/2023 13:27:01).

ERROR – lasetup execution unsuccessful.

lawappinstall PREVIEW YEAREND126174.preview installation completed unsuccessfully at 12/20/2023 13:27:01.

 

Resolution:     

Follow these simple steps to resolve the issue above.

  1. Back up the current LUU directory
  2. Create a new, blank LUU folder
  3. Update the pl program in the new LUU folder
  4. Run the following command:

perl LUUsetup.pl -c E:\LUU

  5. Finally, run the CTP preview again. There should be no more errors.
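For illustration, here is a minimal Windows Command Prompt sketch of steps 1 through 4, assuming the LUU directory is E:\LUU, that "the pl program" refers to LUUsetup.pl, and that the updated LUUsetup.pl was downloaded to C:\temp\LUU_download (all paths are placeholders for your environment):

rem 1. Back up the current LUU directory by renaming it.
ren E:\LUU LUU_backup
rem 2. Create a new, blank LUU folder.
mkdir E:\LUU
rem 3. Copy the updated LUUsetup.pl into the new folder.
copy C:\temp\LUU_download\LUUsetup.pl E:\LUU\
rem 4. Run LUU setup against the new folder.
cd /d E:\LUU
perl LUUsetup.pl -c E:\LUU

Then rerun the CTP preview (step 5).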

These days, many businesses are moving their systems from on-premise solutions to the cloud. The reason is the many benefits cloud computing has to offer; key offerings include databases, infrastructure, platforms, software and storage capabilities that scale to seamlessly meet different operational requirements. One such benefit of being on the cloud is the use of artificial intelligence (AI) and business process automation. Will Quinn, Director of Global WMS Strategy at Infor, shares an article on Informatique News explaining the considerations of deploying AI in the cloud versus on-premises. Quinn notes that comparing AI deployment methods brings multiple issues to light and makes it possible to weigh the advantages and disadvantages, while highlighting the central role of security in defining the optimal approach for companies whose activity relies on AI. Quinn states that the Cloud represents the ideal solution in most cases, for the 10 reasons below, and also lists the disadvantages of AI deployed on-premises:

Benefits of AI and Cloud 

  1. Cost-effective scaling: Cloud services facilitate cost-effective scalability of machine learning models without significant upfront investment, promoting flexibility.
  2. Reduced initial investment: AI deployed in the Cloud frees companies, particularly those with limited resources, from investing in expensive hardware.
  3. Ease of deployment: The speed of deployment offered by the Cloud streamlines configuration processes, which promotes innovation and accelerates the launch of new projects.
  4. Improved security: Cloud service providers invest in rigorous security protocols to offer their customers cutting-edge encryption and authentication mechanisms.
  5. Accessibility and collaboration: AI deployed in the Cloud facilitates access and encourages seamless collaboration between multiple users, thereby increasing project efficiency.
  6. Compliance: Adopting Cloud services that comply with current standards ensures compliance with data protection regulations.
  7. Always up-to-date information: Regular updates and patches from cloud service providers reduce vulnerabilities and minimize the risk of data breaches.
  8. Distributed backup: Because data is stored in the Cloud across multiple locations, the risk of data loss due to physical disasters or hardware failures is minimized.
  9. Expertise and monitoring: Cloud service providers employ security experts specializing in cyber monitoring and continuous threat response.
  10. Scalability and interoperability: Integrating Cloud-based AI into existing systems ensures seamless operation and scalability.

Disadvantages of AI deployed on-premises

  • Higher initial investment: Implementing AI on-site requires significant investments in hardware, software and qualified personnel.
  • Limited scalability: Scaling on-premises infrastructure poses challenges, especially if there is a sudden increase in computing needs.
  • Maintenance and upkeep: The responsibility for maintaining and upgrading equipment adds to operational costs.
  • Technological obsolescence: Rapid advances in AI equipment risk making on-premises configurations obsolete more quickly than solutions deployed in the Cloud.
  • Resource dependence: Robust security requires the use of skilled operators, which puts a strain on company resources.
  • Physical security concerns: On-premises setups are vulnerable to physical threats such as theft attempts or natural disasters.

 

For Full Article, Click Here

What do the Lawson Base Mingle Roles Control?

  • “Infor-SuiteUser” is the end-user role. This is the default role assigned to all users. Users with this role have access to the portal only. The portal is one of the components of the Infor Ming.le application; it consists of a top-level header, an app switcher panel, search, the user menu, share, bookmarks, and a right panel (context/utility applications panel). Users with only this role do not have access to the social space or ION-related features.

 

  • The “MingleEnterprise” role provides access to the social space component of the Infor Ming.le application. The social space component consists of activity feeds, connections, and groups.

 

Users who have this role can do these actions:

    • View the activity feed page
    • Post messages to colleagues and groups
    • Create new groups
    • Connect to users and groups

 

  • “MingleAdministrator” is the role assigned to users who need access to the administration pages in Infor Ming.le.

 

By design, the “MingleAdministrator” role is added to all applications in the tenant. The user with this role can view all application icons on the App Switcher panel. The user’s ability to open the application and access functionality, however, is controlled by the application security.

 

Users who have this role can see the Admin Settings menu item under the profile menu.

 

Users who have this role can do these actions:

    • Manage applications
    • Manage context/utility applications
    • Manage drillbacks
    • Manage general settings

A user with the “MingleAdministrator” role also needs the “MingleEnterprise” role in order to administer some of the user-related features in the social space.

These users can also do these actions:

    • Manage users’ feeds and groups’ feeds
    • Delete any Infor Ming.le group
    • Deactivate the users and groups and also reactivate them

 

  • “MingleIONEnabled” is a role that allows users to access ION-related features within Infor Ming.le.

ION-related components consist of alerts, tasks, ION notifications, and workflows.

 

Users who have this role can do these actions:

    • View alerts and perform all the actions in the alerts
    • View tasks and ION notifications and perform all the actions in the tasks and ION notifications

The Alerts and Tasks options are displayed in the user menu for users who have this role.

Infor Nexus™, the single-instance intelligent supply chain network platform providing unparalleled visibility and collaboration, recently announced a new product to help companies map their multi-tier supply chains, provide evidence to support compliance with regulations such as the German Supply Chain Due Diligence Act, the Uyghur Forced Labor Prevention Act (UFLPA) and the French AGEC Law, and document substantiation for product claims. Per the press release, this new application, called Map and Trace, was co-developed with industry leader Burton Snowboards to help companies “connect the dots” between suppliers, their suppliers, and the transactional records supporting the chain of custody. Further, the process of mapping the suppliers and collecting the documentation is highly manual and first requires participation from the tier 1 supplier. Moreover, Map and Trace is the first application that will help Infor Nexus customers achieve greater transparency and traceability across the product lifecycle. Up next, Infor Nexus is expecting to launch a broader traceability solution, providing a digital product identifier and helping comply with regulations such as the EU Digital Product Passport.

 

For Full Article, Click Here

When it comes to cloud computing, Amazon Web Services (AWS) is practically synonymous with everything cloud and cloud security. Tony Bradley, cybersecurity expert and Editor-in-Chief of TechSpective, shares an article on Forbes highlighting AWS’s chief information security officer (CISO) Chris Betz’s insights into AWS’s security strategies, emphasizing the integral role of threat intelligence and the company’s deep-rooted security culture.

AWS’s Approach to Security – “Betz shared that AWS’s security philosophy is centered on proactive and comprehensive protection, treating security not as an afterthought but as a fundamental component of its services. Betz noted that AWS’s infrastructure itself acts as a sensor, providing a broad and deep view of potential threats. This capability enables AWS to respond swiftly and effectively, protecting its customers and enhancing the overall security of the Internet.”

The Importance of Threat Intelligence – “Threat intelligence is a cornerstone of AWS’s security strategy. By tracking and analyzing malicious activities, AWS can preemptively address threats before they impact customers. Betz provided an example of this approach with the tool MadPot, a honeypot framework that has analyzed billions of interactions with malicious actors. This analysis helps AWS to push threat intelligence automatically into AWS services like GuardDuty, AWS WAF, and AWS Shield​.”

The Shared Responsibility Model – “A key aspect of cloud security is the shared responsibility model, which delineates the security obligations of AWS and its customers. Betz emphasized that while AWS is responsible for securing the cloud infrastructure, customers must secure their applications and data within the cloud. This model requires a collaborative approach to security, moving beyond mere compliance to a partnership aimed at collective success.”

Cultivating a Security-First Culture – “Beyond technological solutions, AWS places a strong emphasis on cultivating a security-first culture. Betz described how security is embedded in every aspect of AWS’s operations, from leadership down to individual developers. He shared that AWS’s CEO and leadership team dedicate an hour each week to discuss security with various engineering teams. This practice ensures that security remains a top priority and that lessons learned are continuously integrated into AWS’s processes.”

Empowering Customers – “AWS’s commitment to security extends to empowering its customers. Betz told me that by providing advanced security tools and clear guidance, AWS helps customers secure their environments and protect their data. He also emphasized the importance of customer control, particularly in the context of generative AI and other emerging technologies. AWS ensures that customers have the tools and knowledge to safeguard their data, enabling them to innovate with confidence.”

 

For Full Article, Click Here

When the ADFS Token-Signing certificate is updated on the ADFS server, it will have to be imported to Lawson and Infor OS.  The networking team should let the Lawson team know when the certificate is being updated in ADFS.

Someone with admin rights on the ADFS instance will need to export the certificate and provide you with the “.cer” file before these tasks can be completed.
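Before importing, it can help to verify that the exported file really is the new token-signing certificate. A minimal check using the certutil tool that ships with Windows, assuming the exported file was saved to C:\certs\adfs-token-signing.cer (a placeholder path):

certutil -dump C:\certs\adfs-token-signing.cer

Compare the NotBefore/NotAfter dates and the certificate hash in the output against the new token-signing certificate shown in the AD FS Management console.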

Update the Certificate in Lawson

Log onto the Lawson Server

Start a ssoconfig -c session

Go to “Manage WS Federation Settings” > “Manage Certificates”

Select “Delete WS Federation Certificate”

Select “Create certificate for WS Federation”

Select “Delete IdP certificate”

Enter the service name of your ADFS service (if you are unsure, export all the services and look for the one that redirects to  your ADFS server).

Select “Import IdP Certificate”

Enter the service name of your ADFS service

Provide the full path where you have the token-signing certificate saved

Reboot the server

Update the Certificate in Infor OS

Log into the Infor OS server as the LAWSON user

Log into the InforOSManager (should be an icon on the desktop)

Go to Identity providers on the left side

Double-click on your IdP

Select “From URL” to import the new certificate and metadata

Provide the URL: https://<your adfs server>/federationmetadata/2007-06/federationmetadata.xml

Click “Load”

Make sure the certificates load (there may only be one, but there should be at least one)

Reboot the server
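If the certificates do not load, a quick sanity check is to confirm that the federation metadata endpoint is reachable from the Infor OS server. A minimal example from a Command Prompt using curl (included with current Windows Server builds), with adfs.example.com as a placeholder for your ADFS host:

curl -o federationmetadata.xml https://adfs.example.com/federationmetadata/2007-06/federationmetadata.xml

If the download fails, resolve connectivity (DNS, firewall, proxy) between the Infor OS server and ADFS before retrying the import.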

 

 

As today’s digital landscape continues to evolve, so do cyber threats and the ongoing challenges organizations must face. Emil Sayegh, CEO of Ntirety and cybersecurity expert, shares an article on Forbes about the evolving role of cybersecurity in an ever-changing world. From sophisticated cyberattacks to internal vulnerabilities, Sayegh notes that threat complexity is escalating, creating pervasive and multifaceted risks. This in turn requires innovative solutions, prompting a shift from traditional security paradigms toward a more integrated, data-driven approach.

Security Silos No More – “The days of siloed security operations are behind us. Cybersecurity is now a critical conversation occurring at the highest levels of business and being intricately woven into every facet of operations.”

Navigating Internal and External Threats with Agility – “Organizations must contend with external hackers and internal employees who misuse resources (consciously or unconsciously) or engage in nefarious activities. The adoption of zero-trust models and emphasis on identity threat management in the face of these risks exemplifies a shift towards more sophisticated, data-driven security practices. These approaches not only defend against known threats, but also anticipate and mitigate potential vulnerabilities from within.”

Beyond Traditional Defenses: Embracing Comprehensive Security – “Modern security has evolved from a peripheral concern to a central element of strategic business planning. The harsh reality is that companies can now face closure due to a security breach, as demonstrated by numerous unfortunate instances. This shift signifies a transition from conventional security protocols to a comprehensive security model that integrates every facet of organizational operations. Through such integration, organizations enhance their ability to effectively anticipate, respond to, and recover from cyber threats.”

The Elusive Cybersecurity Nirvana – “Technological advancements, such as artificial intelligence (AI) and machine learning (ML), have revolutionized security monitoring. These technologies enable organizations to detect and respond to threats more efficiently by analyzing vast amounts of data to identify patterns and predict potential security incidents. Comprehensive security encompasses a multifaceted approach that extends beyond these technological defenses to include policy, governance, and human factors. It blends business acumen with security expertise, integrating solutions into an interconnected system that supports business continuity and creates value.”

Building a Resilient Future – “As organizations navigate the intricacies of the threat landscape, cybersecurity success will be defined by an emphasis on risk and resilience, alongside a proactive, data-driven approach. This integration of security monitoring services within a comprehensive security framework represents a pivotal shift in how organizations approach cyber defense. By seamlessly combining advanced monitoring capabilities, strategic planning, and a profound understanding of business operations, organizations can establish a resilient security posture.”

 

For Full Article, Click Here