Infor has identified Thailand’s manufacturing sector as a key potential market in Southeast Asia as the country continues to digitize under the Thailand 4.0 policy, its CEO Kevin Samuelson shared with The Nation in a virtual interview, written by Nongluck Ajanapanya. Per the article, Samuelson’s remarks, during a trip to Bangkok on June 20, addressed the country’s digital transformation journey, the technology landscape, and the role and trend of artificial intelligence (AI), particularly the emergence of generative AI (GenAI). He also discussed the strategy for using AI and cloud technology to drive innovation and efficiency in Thailand’s business landscape. Ajanapanya notes that the CEO recognized the gradual nature of AI’s impact, anticipating a significant increase in productivity, paving the way for a future in which work is focused on more rewarding and challenging tasks. Samuelson also emphasized the importance of identifying specific industry use cases to generate value from AI, citing an example of how Infor’s AI solutions helped a forklift company improve efficiency, lower costs and boost sales. “Discussing technological trends in the enterprise resource planning (ERP) industry,” Ajanapanya writes, “Samuelson underlined the importance of moving to the cloud, citing benefits in flexibility, data security, and efficiency. He also noted a growing trend of process and shop floor automation, which leads to increased efficiency and better results.” Moreover, in terms of Infor’s growth strategy in Thailand, Samuelson disclosed that the company’s focus on specific industries enables it to provide tailored solutions to businesses with smaller budgets. “As Thailand embraces digital transformation,” Samuelson states, “Infor’s targeted approach and AI-driven solutions position it to play an important role in shaping the country’s manufacturing and distribution sectors.”

 

For Full Article, Click Here

Installing LDAP certificate in AD LDS instance

  1. Identify the AD LDS service instance in Services
    • LSF
  2. Launch MMC (Microsoft Management Console)
  3. Choose File > Add/Remove Snap-In
  4. Add the Certificates snap-in
  5. Choose “Service” account and click “Next”
  6. Choose “Local Computer” and click “Next”
  7. Choose the Service Account for your AD LDS service and click “Finish”

  8. Right-click on the service that was added and select “All Tasks > Import”
  9. Click next and browse to the .pfx certificate file. Click “Next”
  10. Enter the private key password
  11. Place the certificate in the <AD LDS service>\Personal store
  12. Click Next then Finish
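If you prefer the command line to the Certificates snap-in, certutil can usually perform the same import into the service store. This is a hedged sketch rather than the documented procedure: it assumes the AD LDS instance service is named LSF (as identified in step 1) and uses an illustrative .pfx path; verify the exact store name with certutil -service -store before relying on it.

rem import the .pfx (including its private key) into the AD LDS service's personal store
certutil -p <pfx password> -service -importpfx "LSF\MY" C:\certs\ldap.pfx

rem confirm the certificate now appears in the service store
certutil -service -store "LSF\MY"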

 

 

Export the certificate for Java OS & Java WebSphere

  1. Right click the certificate > All Tasks > Export and click Next
  2. Do not export the private key
  3. Choose Base-64 encoded X.509 (.cer) and click Next
  4. Choose a location to save the file for later use
  5. Click finish
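The export can also be scripted with certutil. A hedged sketch, again assuming the LSF instance name and illustrative file paths; certutil writes DER by default, so a second command converts the file to Base-64 for the later Java steps:

rem write the certificate (public part only) out of the service store, identified by its thumbprint
certutil -service -store "LSF\MY" <thumbprint> C:\certs\ldap.der

rem convert DER to Base-64 (.cer), the format expected by the Java keystores
certutil -encode C:\certs\ldap.der C:\certs\ldap.cer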

Grant Permissions to Certificate Container

  1. Run command “certutil -store MY”
  2. Find the container with your AD LDS certificate using the thumbprint to identify it
  3. Give NETWORK SERVICE read & execute permissions on the key container file AND the key container directory (C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys)
  4. Stop the AD LDS environment service
  5. Restart the AD LDS service
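From an elevated prompt, steps 2 through 5 can be done with certutil, icacls, and net. A hedged sketch with placeholder thumbprint, container file name, and service name:

rem the verbose output includes the unique container name, i.e. the file under MachineKeys
certutil -v -store MY <thumbprint>

rem grant NETWORK SERVICE read & execute on the key container file and on the MachineKeys directory
icacls "C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys\<container file name>" /grant "NETWORK SERVICE:(RX)"
icacls "C:\ProgramData\Microsoft\Crypto\RSA\MachineKeys" /grant "NETWORK SERVICE:(RX)"

rem restart the AD LDS instance so it picks up the certificate
net stop <AD LDS instance service name>
net start <AD LDS instance service name>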

Smoke Test

  1. Open the ldp.exe tool
  2. Enter the server FQDN and the SSL port (636), then check the SSL box
  3. Click “OK”
  4. Verify that the connection over LDAPS is successful

Update the LDAP Certificate in WebSphere

Cell Trust Store

  1. Access the WAS Admin Console and navigate to: Security > SSL certificate and key management > Key stores and certificates > CellDefaultTrustStore > Signer certificates
  2. Click the Retrieve from port button.
  3. Host: <your AD LDS host>
  4. Port: 636
  5. Alias: give it a meaningful name
  6. Click Retrieve signer information.
  7. Click OK & save changes.

 

Node Trust Store

  1. Access the WAS Admin Console and navigate to: Security > SSL certificate and key management > Key stores and certificates > NodeDefaultTrustStore (for the LSF server) > Signer certificates
  2. Click the Retrieve from port button.
  3. Host: <your AD LDS host>
  4. Port: 636
  5. Alias: give it a meaningful name
  6. Click Retrieve signer information.
  7. Click OK & save changes.
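If you would rather script this than click through the console, WebSphere's wsadmin tool exposes an equivalent AdminTask command in recent releases. A hedged sketch (the host name and alias are illustrative, and you may need to add -keyStoreScope if the key store name is ambiguous in your cell):

wsadmin -lang jython -c "AdminTask.retrieveSignerFromPort('[-keyStoreName CellDefaultTrustStore -host adldshost.example.com -port 636 -certificateAlias adlds_ldaps]'); AdminConfig.save()"

Run it a second time with -keyStoreName NodeDefaultTrustStore for the node trust store, then repeat in the Landmark WebSphere instance as noted below.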

 

Perform these same steps in the Landmark WebSphere instance.

Update LDAP Certificate in OS Java

Do this in both Lawson and Landmark

  1. Open a command line and set environment variables
  2. Run command “where java” to determine where LAW_JAVA_HOME is located
  3. Back up <LAW_JAVA_HOME>/jre/lib/security/cacerts
  4. Copy the cert that you exported from the LSF service from the Lawson server to the Landmark server
    • This is the cert you will be importing into cacerts
  5. Run the ikeyman utility at WAS_HOME/bin
  6. Open the LAW_JAVA_HOME/jre/lib/security/cacerts file and select the key database type of JKS
  7. Type password “changeit” (default)
  8. Select “Signer Certificates”
  9. Delete the existing certificate, then re-add it
  10. Click “Add” and navigate to the LDAP certificate exported earlier
  11. Give it a meaningful name
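If iKeyman is not available, keytool (shipped with the same JRE) can make the equivalent change. A hedged sketch, assuming an illustrative alias of adlds_ldaps and the exported certificate at C:\certs\ldap.cer; skip the delete if the alias is not already present:

cd /d "%LAW_JAVA_HOME%\jre\lib\security"

rem remove the old signer entry, then import the new Base-64 certificate
keytool -delete -alias adlds_ldaps -keystore cacerts -storepass changeit
keytool -importcert -trustcacerts -alias adlds_ldaps -file C:\certs\ldap.cer -keystore cacerts -storepass changeit

rem confirm the entry is present
keytool -list -keystore cacerts -storepass changeit -alias adlds_ldaps

The same commands can be pointed at the WebSphere Java cacerts described in the next section.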

Update LDAP Certificate in WebSphere Java

  1. The WebSphere Java directory is WAS_HOME/java
  2. Back up the file WAS_HOME/java/jre/lib/security/cacerts
  3. Perform the same steps as for OS Java using iKeyman for both Java instances

 

Infor was the only vendor named as a Leader in all four of the following IDC MarketScape reports – IDC MarketScape: Asia/Pacific SaaS and Cloud-Enabled SMB/Midmarket Segment ERP 2024 Vendor Assessment; IDC MarketScape: Worldwide SaaS and Cloud-Enabled Small Business ERP Applications 2024 Vendor Assessment; IDC MarketScape: Worldwide SaaS and Cloud-Enabled Large Enterprise ERP 2023–2024 Vendor Assessment; and IDC MarketScape: Worldwide SaaS and Cloud-Enabled Medium-Sized Business ERP Applications 2024 Vendor Assessment. Per the press release, the IDC MarketScape reports evaluate vendors against a comprehensive and rigorous framework, assessing each vendor relative to the criteria and to one another; the key strategy criteria include both qualitative and quantitative data, resulting in a single graphical illustration of each vendor’s position within a given market. Infor was recognized for its cloud platform, its ability to scale, and its overall industry expertise. Mickey North Rizza, group VP, Enterprise Software at IDC, said: “The digital world has reshaped businesses of all sizes’ focus on moving to the cloud to improve their speed, scale, agility, market share, and competitive advantage. This requires adapting new ERP technologies that enable speed and scale by reducing process steps and clicks, automating every workflow possible, embedding finance to collect and make payments, and helping improve overall decision velocity.” Further, Infor cloud ERP solutions deliver industry-specific capabilities without extensive customizations or integrations by combining the Infor cloud platform, built on infrastructure services from Amazon Web Services® (AWS®), with Infor OS. By migrating critical business applications to the cloud, organizations receive automatic upgrades that deliver the latest advances in enterprise functionality.

 

For Full Article, Click Here

Resolution 1:

One reason you could be receiving this error is because there is an additional patch.tar file from a previous or concurrent CTP install.

 

After running the tar command, you should have only three files for this CTP in the <versionfiledir> to which you uncompressed the CTP.

Examine the extracted files to make sure you received the following three files:

– x.x.x_patch_CTPnumber.readme.html

– Versions

– patch.tar.Z

 

Remove any previous CTP files from this directory, especially any patch.tar files, and run the lawappinstall again.
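For example, on a Unix-style server (use dir and del on Windows), assuming the CTP was uncompressed to an illustrative /tmp/ctp directory:

cd /tmp/ctp
# only the readme, Versions, and patch.tar.Z files for this CTP should be present
ls
# remove any leftover patch.tar (or other files) from a previous CTP
rm patch.tar
# rerun the install
$GENDIR/bin/lawappinstall preview <productline>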

 

Resolution 2:

If you are encountering this error on a Windows server, it is possible that there are spaces in the folder names in the path to the versions directory. If so, you would receive the “failed to uncompress” message.

 

Replace the spaces with underscores (“_”), or use folder names that do not contain spaces, and run lawappinstall again.

 

Resolution 3:

Make sure the user applying the patch has the proper Windows file permissions to install it. These permissions should cover the entire LSF application directory.
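To review and, if necessary, grant those permissions from the command line, a hedged sketch; the D:\lsf path and the LAWSONADM account are illustrative placeholders for your LSF application directory and patch user:

rem review the current permissions on the LSF application directory
icacls "D:\lsf"

rem grant the patch user full control on the directory and everything beneath it
icacls "D:\lsf" /grant "LAWSONADM:(OI)(CI)F" /T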

In today’s ever-growing digital landscape, it is imperative that businesses equip themselves with the essential tools and insights that can help them navigate the complex terrain of digital transformation. Ekaterina Dudakova, News Analyst at ERP Today, shares an article that explores these opportunities through a session with Thomas Iseler, global strategic partnerships advisor at Data Migration International (DMI), about his views on the transformation possibilities following the SAP Adoption Catalyst event held in the spring of this year. When asked about the key takeaway customers should note from the Catalyst event, Iseler focused specifically on data. Oftentimes, Dudakova notes, customers tend not to think about data as a first and foremost matter, but rather leave it as a “difficult and painful” topic that “you deal with when you have to”. Iseler explained that this tends to happen because of how much is at stake when it comes to dealing with data: downtime planning, in particular, is seen as difficult because it involves business interruption while users decide which data they want. He adds, “During that process, it is common that ‘you will never get a straight answer. It will change five million times and it’s probably always more data than [end users] really need’”. DMI offers a different approach: at the Catalyst event it unveiled the JiVS Information Management Platform (JiVS IMP) – an SAP-certified solution for S/4HANA migration aimed at helping companies halve their migration workloads, slash operating costs and bolster security measures. At the event, it was explained that JiVS IMP is an end-to-end information management platform that ‘divides and conquers’ customers’ data, separating operational and historical data to reduce the burden on the live systems. “The platform cleans, sorts and refines the data to preserve its quality,” adds Iseler. “This way, the historical data and documents are stored in JiVS IMP and can be accessed from both inside and outside SAP.” Moreover, Dudakova states that as the cloud gradually overtakes ERP (enterprise resource planning) technology, it is vital to consider how digital transformation will affect your data. “With a solution like DMI’s JiVS IMP,” she concludes, “you can be confident that your data will be stored safely and your cloud migration journey is smooth and efficient.”

 

For Full Article, Click Here

Problem:           

The patch being installed is failing when running lawappinstall activate. It fails when running ujobload.

10/31/2023 5:44:25 Executing ujobdump.

10/31/2023 5:44:25 ujobdump execution successful.

10/31/2023 5:44:25 Executing ujobload.

10/31/2023 5:44:26 ERROR – ujobload failed.

When run via lawappinstall activate, ujobload reports “*** No jobs found to load”. When run manually, it fails with Segmentation Fault (core dump).

 

Resolution:     

lawappinstall update will stage tokens potentially needing a ujobdump/ujobload in LAWDIR/productline/backup/ACTIVATEstage/JOBconversion.

If conditions are correct, lawappinstall activate will run ujobdump and ujobload, then clean up the staged area.

  a. ujobdump -d LAWDIR/productline/backup/ACTIVATEstage/JOBconversion productline $LAWDIR/productline/backup/ACTIVATEstage/JOBconversion.dmp -t <list of Tokens>

In the above, <list of Tokens> is a space-separated list of the tokens located in the $LAWDIR/productline/backup/ACTIVATEstage/JOBconversion/??src directories.

  b. ujobload -ou productline LAWDIR/productline/backup/ACTIVATEstage/JOBconversion.dmp
  c. Remove the dump file, LAWDIR/productline/backup/ACTIVATEstage/JOBconversion.dmp
  d. Remove the stage directory, LAWDIR/productline/backup/ACTIVATEstage/JOBconversion

Run steps a through d manually, then rerun lawappinstall activate. If the ujobload fails in step b with a Segmentation Fault (core dump) or other error, make sure the user running ujobload has write access to these files and their directories.

LAWDIR/UJobLoadDir/productline/Tokens

LAWDIR/productline/UJobLoadDir/LDLog

Make corrections if necessary, run steps b through d above, and rerun lawappinstall activate.
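On a Unix-style install, a quick hedged check of that write access might look like the following (the productline prod and user lawson are illustrative; on Windows, review the NTFS permissions on the same paths instead):

# confirm ownership and permissions on the token and log locations used by ujobload
ls -ld $LAWDIR/UJobLoadDir/prod/Tokens
ls -ld $LAWDIR/prod/UJobLoadDir/LDLog

# if the ujobload user cannot write to them, correct ownership and permissions
chown -R lawson $LAWDIR/UJobLoadDir/prod/Tokens $LAWDIR/prod/UJobLoadDir/LDLog
chmod -R u+rwX $LAWDIR/UJobLoadDir/prod/Tokens $LAWDIR/prod/UJobLoadDir/LDLog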

Many businesses worldwide are beginning to allow more artificial intelligence (AI) in their business systems. In the Philippines, experimenting with AI is taking time but promises growth and increased productivity for businesses there. Technology Content Writer Dawn Solano shares an article on PhilSTAR Tech about an interview with Infor CEO Kevin Samuelson and his thoughts on AI’s impact on the Philippines’ business landscape. “AI, being the ‘next frontier for technology’ will encourage creativity that Filipino businesses and businesses in most countries should explore,” he said. In addition, he told PhilSTAR Tech, “Those that are experimenting and learning will be those that end up ultimately benefiting the most.” Infor has used AI in its cloud software services for years, benefiting both itself and its clients. This knowledge of AI in the tech industry has allowed the tech giant to apply more advanced methods for its clients, Solano notes from the interview. Additionally, when asked how legacy businesses in the Philippines that are deep in technology debt should approach AI, Samuelson acknowledged that it could be overwhelming, yet it does not necessarily require deep technical expertise. Moreover, he states that there are a lot of third-party agencies that could help get companies ‘up to speed.’ “These hyperscalers have built this amazing amount of compute, put the algorithms in place,” Samuelson notes. “And so I think the evolution of the technology has also made it far easier to adopt.”

 

For Full Article, Click Here

One of the latest competitive advantages in today’s business digital landscape is the efficient use of artificial intelligence (AI) and cloud computing. When done right, AI and cloud computing help drive business growth, productivity and security without driving up costs or resource use. From innovative AI management tools to cutting-edge cloud optimization strategies, Forbes Technology Council members share some best practices below to help your business maximize performance and minimize costs.

  1. Remove Silos To Prepare For A Quantum Future – “A quantum future is approaching, and leaders need to prepare now. It has the potential to drive massive productivity on its own, but when paired with AI, it can create a powerhouse of efficiency and security. Leaders need to start removing silos now to inform all teams about how strategic adoption can drive long-term success. – William Briggs, Deloitte Consulting”
  2. Utilize AIOps To Predict Potential Issues –  “One holistic approach to enhance AI and cloud computing efficiency is automation through AIOps. Automation can predict potential issues before they impact business operations through data analysis, streamlined workflows and reduced risk of human error. As a monitoring and predictive capability, automation prevents downtime, optimizes resource allocation and enhances security and compliance. – Akhilesh Tripathi, Digitate”
  3. Optimize Efficiency And Minimize Resource Duplication With Containerization – “Containerization is one process that optimizes resource efficiency by isolating applications while sharing the same operating system. This process also minimizes the resource duplication and reduces the cloud costs. By using this, companies can test their AI models without wasting their valuable resources. – Asad Khan, LambdaTest Inc.”
  4. Reduce Waste And Costs With Serverless Computing Architectures – “One effective practice is adopting serverless computing architectures. This approach allows companies to run applications and services without managing servers, which optimizes AI workloads by dynamically allocating resources based on demand, thereby reducing waste and costs. – Savitri Sagar, Kenzo Infotech”
  5. Emphasize The Importance Of Data Quality – “One best practice for enhancing AI is implementing data-driven reports with full traceability, educating all employees on the importance of data quality and fostering a data-first culture over time. This approach ensures data sufficiency for effective AI usage, transforming organizational decision-making and enhancing strategic alignment and efficiency. – Ged Ossman, Interf”
  6. Look Toward The Future Of Space-Based Technologies – “In the future, space-based data centers will be used for cloud computing for terrestrial applications. These will be more efficient than earth-based data centers as they will be able to use passive radiative cooling and efficient laser-based networking, and can theoretically co-locate far more computing power than the terrestrial electrical grid allows. – Ezra Feilden, Lumen Orbit, Inc.”
  7. Explore Neuromorphic Computing For Complex AI Tasks – “Neuromorphic computing is an emerging technology that mimics the human brain’s structure and function, allowing for more efficient AI processing with lower energy consumption. Neuromorphic chips can help companies perform complex AI tasks at the edge, speeding up data processing and reducing cloud computing demands. – Sriram Panyam, DagKnows”
  8. Build In Escape Hatches For Common Workflows – “Companies should consider building in “escape hatches” for common workflows. For example, if you deploy an internal AI chatbot for HR questions, you could anticipate the most common requests (e.g., request a pay stub or tax docs), and rather than have an AI decide what to do, direct the user to the process you have for pay stubs and tax docs. – James Ding, DraftWise”
  9. Leverage Unified Solutions Over Point Solution-Based AI – “Like past technological waves, AI is setting up to be a race to the middle. That has played out with the cloud, and now we get less from each cloud investment, as people are overwhelmed by how much we’ve thrown at them. Leveraging unified versus point solution-based AI that sits on top of our cloud software can improve the efficiency of our cloud investments and ensure we get durable differentiation from AI. – Michael Haske, Krista.ai”
  10. Develop A Well-Defined Data Catalog – “A well-defined data catalog with rich metadata and semantics is crucial for spearheading GenAI-based solutions in companies. A data catalog enables more sophisticated data analytics and AI capabilities. It helps in automating the data preparation process and supports more advanced techniques like semantic search, which can be leveraged to build more powerful and effective GenAI solutions. – Faisal Fareed, Amazon Web Services”
  11. Use Smaller Language Models For A Lower Carbon Footprint – “Smaller language models are more efficient, secure and able to scale economically with a lower carbon footprint. These closed models can be pre-trained on specific domains with AI workflows for the most optimized business processes while bringing a curated collection of information specific to your enterprise. – Venky Yerrapotu, 4CRisk.ai”
  12. Segment AI Applications And Workloads Based On Cost – “AI is a diverse technology with many applications that will entail many different deployment options that will be constantly changing as edge AI and cloud AI continue to evolve. Tech leaders should segment AI applications and workloads based on the cost (which includes IT or CSP’s energy cost) and determine where they are best run on-device, across edge infrastructure and cloud. – Leonard Lee, neXt Curve”
  13. Consider Distributed Computing – “Distributed computing is certainly one of the key emerging technologies companies should consider to improve efficiency in their AI and cloud computing resource consumption. AI, especially generative AI and advanced AI research, can be incredibly computing-intensive. By applying advanced distributed computing techniques, businesses can orchestrate AI tasks to appropriate resources and reduce costs. – Humberto Farias, Concepta”
  14. Look Into Federated Learning To Reduce Centralized Storage Needs – “One emerging technology to improve AI and cloud computing efficiency is federated learning. It enables models to be trained across multiple decentralized devices or servers, reducing the need for centralized data storage and massive computational power. This approach not only enhances data security but also optimizes resource usage by leveraging the computing power of edge devices. – Jiahao Sun, FLock.io”
  15. Consider LLMOps To Optimize LLM Use – “Large language model operations (LLMOps) is an emerging category that can help provide for more efficient and optimized use of LLMs. Running these models can be expensive, which can make it challenging to achieve sufficient ROI as well as scale. LLMOps are essentially a framework that focuses on efficient resource allocation, tracking models, evaluating responses and improving inference. – Muddu Sudhakar, Aisera”
  16. Educate Employees On The Nuances Of Proper Prompts – “Education on prompt engineering best practices is the most important way to improve companies’ efficient use of AI. AI is rapidly evolving, but in today’s environment, chatbot interfaces and the prompts that are used to get valuable output from today’s LLMs are critical to deriving value. Most employees treat them like search engines, do not understand the nuances of prompts and need training. – Michael Keithley, United Talent Agency”
  17. Focus On The Fundamentals – “Zeroing in on the fundamentals of your business can be a powerful way to optimize resource consumption. By revisiting basic principles like clearly defining project goals, streamlining data pipelines and right-sizing cloud resources, you can ensure your AI and cloud investments are truly delivering value, facilitating efficiency gains and allowing teams to do more with less. – Todd Fisher, CallTrackingMetrics”
  18. Leverage Computing Power From Endpoint Devices – “Not everything needs to be done in the cloud. There are a lot of computing resources sitting idle at every organization, every single day. Leveraging the computing power from endpoint devices brings back distributed computing at essentially no cost. – Elise Carmichael, Lakeside Software”

 

For Full Article, Click Here

Problem:           

Sometimes when running a CTP patch install preview (GENDIR/bin/lawappinstall preview <productline>), the program executes lasetup with the preview option and displays the following error:

ERROR – failed to uncompress “patch.tar.Z” file.

Installation YEAREND126174.preview of YEAREND126174 terminated abnormally (start = 12/20/2023 13:27:01, stop = 12/20/2023 13:27:01).

ERROR – lasetup execution unsuccessful.

lawappinstall PREVIEW YEAREND126174.preview installation completed unsuccessfully at 12/20/2023 13:27:01.

 

Resolution:     

Follow these simple steps to resolve the issue above.

  1. Back up the current LUU directory
  2. Create a new blank folder for LUU
  3. Copy the updated LUUsetup.pl program into the new LUU folder
  4. Run the following command:

perl LUUsetup.pl -c E:\LUU

  5. Finally, run the CTP preview again. There should be no more errors.
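Put together, steps 1 through 4 might look like the following at a Windows command prompt, assuming LUU lives at E:\LUU (as in the command above) and with an illustrative source path for LUUsetup.pl:

rem step 1 - back up the current LUU directory
move E:\LUU E:\LUU_backup

rem step 2 - create a new, blank LUU folder
mkdir E:\LUU

rem step 3 - place the updated LUUsetup.pl in the new folder (source path is illustrative)
copy D:\ctp_media\LUUsetup.pl E:\LUU

rem step 4 - run the setup
cd /d E:\LUU
perl LUUsetup.pl -c E:\LUU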

These days, many businesses are moving their systems from on-premise solutions to the cloud because of the many benefits cloud computing has to offer. Key offerings of being on the cloud include databases, infrastructure, platforms, software and storage capabilities that scale to seamlessly meet different operational requirements. One such benefit of being on the cloud is the utilization of artificial intelligence (AI) and business process automation. Will Quinn, Director of Global WMS Strategy at Infor, shares an article on Informatique News weighing the deployment of AI in the cloud versus on-premises. Quinn notes that comparing AI deployment methods brings multiple issues to light and makes it possible to evaluate the advantages and disadvantages, while highlighting the central role of security in defining the optimal approach for companies whose activity relies on AI. Quinn states that the cloud represents the ideal solution in most cases, for the 10 reasons below, and also outlines the disadvantages of AI deployed on-premises:

Benefits of AI and Cloud 

  1. Cost-effective scaling: Cloud services facilitate cost-effective scalability of machine learning models without significant upfront investment, promoting flexibility.
  2. Reduced initial investment: AI deployed in the cloud frees companies, particularly those with limited resources, from investing in expensive hardware.
  3. Great ease of deployment: The speed of deployment offered by the cloud streamlines configuration processes, which promotes innovation and accelerates the launch of new projects.
  4. Improved security: Cloud service providers are investing in rigorous security protocols to offer their customers cutting-edge encryption and authentication mechanisms.
  5. Accessibility and collaboration: AI deployed in the cloud facilitates access and encourages seamless collaboration between multiple users, thereby increasing project efficiency.
  6. Compliance: Adopting cloud services that comply with current standards ensures compliance with data protection regulations.
  7. Always up-to-date information: Regular updates and patches from cloud service providers reduce vulnerabilities and minimize the risk of data breaches.
  8. Distributed backup: Because data is stored in the cloud across multiple locations, the risk of data loss due to physical disasters or hardware failures is minimized.
  9. Expertise and monitoring: Cloud service providers employ security experts specializing in cyber monitoring and continuous threat response.
  10. Scalability and interoperability: Integrating cloud-based AI into existing systems ensures seamless operation and scalability.

Disadvantages of AI deployed on-premises

  • Higher initial investment: Implementing AI on-site requires significant investments in hardware, software and qualified personnel.
  • Limited scalability: Scaling on-premises infrastructure poses challenges, especially if there is a sudden increase in computing needs.
  • Maintenance and upkeep: The responsibility for maintaining and upgrading equipment brings additional operational costs.
  • Technological obsolescence: Rapid advances in AI hardware risk making on-premises configurations obsolete more quickly than solutions deployed in the cloud.
  • Resource dependence: Robust security requires the use of skilled operators, which puts a strain on company resources.
  • Physical security concerns: On-premises setups are vulnerable to physical threats such as theft attempts or natural disasters.

 

For Full Article, Click Here