Noble House Hotels & Resorts, known for award-winning boutique hotels, luxury resorts, and unforgettable adventures, recently selected Infor Hospitality Management Solution (HMS) as its property management system (PMS) of record. Replacing its existing PMS and fully standardizing on Infor's cloud-based platform is part of the brand's initiative to adopt a PMS built for the cloud, with the flexibility, security, efficiency, and mobile capabilities that Infor HMS offers. Per the press release, Noble House has already completed successful go-lives at 10 properties, with an additional 12 slated for 2024. With Infor, Noble House gains immediate access to modern technology to unify and refine operations, deliver superior guest experiences, and execute intelligent strategies that continue to deliver the unexpected across its hand-selected portfolio of experience-driven hotels and resorts. Additionally, Infor HMS will serve as the hub of hotel operations for Noble House, providing a holistic technology platform that connects every aspect of operations to the guest experience, creating a seamless journey that will help the organization continue to meet and exceed the highest standards.

 

For Full Article, Click Here

In the ever-growing world of healthcare, protecting and accessing big data is critically important. Yet hospitals and health systems often underestimate and/or under-budget their data storage and archiving plans, leading to costly mistakes and greater challenges down the road. While cost is essential and time is critical, there should be no shortcuts in selecting a data archiving strategy for your hospital's data systems. A carefully thought-out budget and a clear timeline should be the first step in your data archiving journey. Dr. Shelly Disser, Vice President of Innovation and Collaboration at MediQuant, the leading innovator and provider of enterprise-active archiving solutions to hospitals and health systems, shares an informative article at HIT Consultant on why budget alignment is critical for stakeholders to consider in a hospital's data archiving plan.

From the moment a new electronic health record (EHR) system is selected, the data archiving journey begins, says Dr. Disser. At a surface glance, she explains, the data archiving process involves three main steps: data extraction, data conversion, and data migration. What happens in between is complex, and that is where you cannot skimp on cost, time, or experience. While there is no "one-size-fits-all" system for data archiving, there are best practices to follow for a solid archiving plan. Dr. Disser explains them below:

  1. Establish Long-Term Goals and Prioritize Managing Legacy System Data. "Start the archiving project along with the EHR transition by setting definite goals and a distinct plan with a tentative go-live date. This can include discontinuing legacy applications, creating a unified archive for all patient data, seamlessly integrating it with the new EHR, ensuring comprehensive record access for clinicians, meeting regulatory storage requirements, expediting medical information requests, and preparing for future data needs and regulations."
  2. Set a Budget That Matches the Goals. “The budget and goals should account for what each stakeholder needs to do with the data today and in the future. For example, non-discrete data archiving – involving file types such as PDFs and images – is less expensive but has critical limitations. By comparison, discrete data helps with use cases across teams and has the flexibility to allow for future demands. It can ensure success with things like HIM requests and complete compliance with Information Blocking. The wrong archival procedure to meet these needs often leads organizations to revamp and revise later. Beware of automated archiving methods that extract data in one large PDF. These are indeed fast and cheap but do not make crucial information either searchable or useable.”
  3. Create an Archiving Project Team. “Effective planning involves assembling a diverse team of stakeholders for an archiving project, drawing from various departments including clinicians, IT, HIM, legal, finance, and external resources if needed. Subject matter experts play a crucial role in understanding workflow intricacies and aiding in data validation. It’s essential to engage these experts early in the process, before they transition to new systems. This interdisciplinary team structure ensures a comprehensive approach to project oversight, data entry, and database management. However, it also requires careful consideration of the team’s existing commitments and potential external resource requirements, all of which should be factored into the budgeting process to prevent undue disruptions to the project or workflow.”

It is important to note that there is no substitute for experience and methodology when choosing an archiving partner. Again, time and money will be factors for stakeholders and decision-makers, but having the right, experienced archiving partner will pay for itself in a solid and smooth archiving plan. You should also trust that your archiving partner knows what to look for and how to extract your data at the start of the project. Nothing is more valuable than data, for hospitals and indeed any business. There are no shortcuts to securing, accessing, and properly storing your data. Dr. Disser concludes that choosing the best archiving partner for collaboration, and aligning collective objectives among stakeholders, will result in optimized patient care while meeting compliance, legal, and research needs, now and in the future.

 

For Full Article, Click Here

The IPA Services for Infor Lawson product (lpaservice.jar) contains the definition for each service.

These steps describe how to download, install and configure the services.

Download and install the services

  1. Download IPA Services for S3 from the Infor Product Download Center.
  2. Transfer the .jar file to your IPA server.
  3. Extract the contents of the .jar file to a new temporary directory.

jar -xvf lpaservice.jar

  4. Run the installation program from the temporary directory. Use the data area created in Landmark for IPA.

perl install-lpaservice.pl dataarea

  5. In order for Lawson applications to create workunits in IPA, standard workflow service data must be loaded onto the Lawson System Foundation (LSF) server.

In the temporary directory that you created, you will see the following directory structure and file:

YourTempDir/lsf/wf-data-10.exp

Copy this file to your normal install directory in your LSF environment.

  6. Run the following command:

perl GENDIR/bin/pflowimpexp.pl imp wf-data-10.exp

Note: If the installation provides feedback that some data was duplicated, you can ignore the message. It means that some or all of the workflow data already existed on the server, which is not a problem.

  7. Verify that the services were imported by viewing them on the IPA server. Navigate to: Start > Applications > Process Server Administrator > Configuration > Service Definitions
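Taken together, the install steps above amount to a short run sheet like the following. This is a sketch, not a literal script: the temporary directory, the path to lpaservice.jar, the data area name (MYDATAAREA), and the copy destination on the LSF server are all placeholders for your environment.

```shell
# Steps 2-4: extract the service definitions and run the installer
mkdir /tmp/lpaservice
cd /tmp/lpaservice
jar -xvf /path/to/lpaservice.jar           # path is a placeholder
perl install-lpaservice.pl MYDATAAREA      # your Landmark data area for IPA

# Steps 5-6: on the LSF server, copy the workflow data into your
# normal install directory (destination shown here is an assumption)
cp lsf/wf-data-10.exp "$LAWDIR"/
perl "$GENDIR"/bin/pflowimpexp.pl imp wf-data-10.exp
```

Because every command here depends on the Infor binaries and your server layout, treat it as a checklist to adapt rather than something to paste and run.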

 

Configure email properties for IPA services

  1. From the IPA administration console on the IPA server, navigate to Configuration > System Configuration.
  2. Locate a configuration set named “ProcessFlowSolutions”.
  3. Verify that the configuration set contains the following properties:
    • FromEmailAddress
    • ToEmailAddress

If the configuration set and/or the two properties do not exist, either re-install the product OR manually create the configuration to add the properties. Be sure to name them as shown above.

  4. Set the values of the properties to be meaningful for your organization.
  5. When you are finished updating, save the configuration.

 

Infor recently announced that Emmi, a leading dairy product manufacturer based in Lucerne, Switzerland, has implemented Infor CloudSuite Food & Beverage, powered by Amazon Web Services (AWS). Per the press release, Emmi replaced its previous on-premises enterprise resource planning (ERP) infrastructure with the industry-specific multi-tenant platform at five locations in just 6.5 months. Additionally, Infor's public cloud strategy enables Emmi to benefit directly from proven industry best practices at its international acquisitions. Its locations in Canada, the United Kingdom, and Benelux are working with Infor CloudSuite Food & Beverage, as are two subsidiaries in Turlock and Sebastopol, California, in the United States, and more subsidiaries are being deployed on the platform. The greatest benefit of this implementation for Emmi is the excellent scalability of the cloud platform, which allows new branches to be connected easily, along with lower operating costs. In addition, the multi-tenant solution is always up to date thanks to regular real-time updates, and Emmi benefits particularly from the newest functions in the areas of finance and reconciliation. Further, the out-of-the-box functionalities for the dairy and cheese industry enabled a fast and efficient implementation at Emmi. These include the receipt of milk in fat, protein and dry matter components, push-pull planning for milk and dairy products, consideration of variable weight in cheese production, and seamless batch traceability.

 

For Full Article, Click Here

Problem:

I am running Requisition Center (RQC) and I need to clear my user’s cookie data due to errors or issues accessing RQC.

 

Resolution:

The cookie and XML data are stored in your RQC work directory. This directory is defined in the LAWDIR/system/rqc_config.xml file. It is most commonly defined as "LAWDIR/RQC_work".

 

To delete the cookie and XML data, follow the steps below:

  1. Change to your rqc_work directory.
  2. Delete all files for the web user(s) (e.g., rm userid*)
  3. Change to your rqc_work/cookies directory.
  4. Delete all files for the web user(s) (e.g., rm userid*)
  5. Change to your rqc_work/xml directory (if it exists).
  6. Delete all files for the web user(s) (e.g., rm userid*)
  7. Run the following URL to perform IOS Cache Refresh: https://hostname/servlet/IOSCacheRefresh
  8. Run the following URL to perform RQC Cache Refresh: https://hostname:port/rqc/html/utility.htm
  9. Have the user delete their browser cache ensuring that ‘All time’ is selected. See linked KB 1188792: “How to clear your browser cache”.
  10. Have the user enter RQC and immediately click on the NEW button.
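Steps 1 through 6 above can be sketched as a few shell commands. This is a minimal sketch that simulates the work-directory layout in a temporary directory so it is safe to run anywhere; on a real system, RQC_WORK would be the work directory defined in LAWDIR/system/rqc_config.xml (commonly LAWDIR/RQC_work), and "jdoe" is a hypothetical web user ID.

```shell
# Simulate the RQC work directory layout in a temp directory.
RQC_WORK=$(mktemp -d)
USERID=jdoe
mkdir -p "$RQC_WORK/cookies" "$RQC_WORK/xml"
touch "$RQC_WORK/${USERID}.dat" \
      "$RQC_WORK/cookies/${USERID}.ck" \
      "$RQC_WORK/xml/${USERID}.xml"

# Steps 1-6: delete all files for the user in each directory.
rm -f "$RQC_WORK/$USERID"*                                  # work directory
rm -f "$RQC_WORK/cookies/$USERID"*                          # cookies directory
[ -d "$RQC_WORK/xml" ] && rm -f "$RQC_WORK/xml/$USERID"*    # xml directory, if it exists
```

The cache-refresh URLs and the browser-side cleanup (steps 7 through 10) still have to be done interactively as described above.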

Infor recently announced that Bahrain's Arab Shipbuilding and Repair Yard Co. (ASRY), the leading ship and rig repair yard in the Arabian Gulf, has gone live with Infor CloudSuite Industrial Enterprise. This powerful cloud-based enterprise resource planning (ERP) system enables the company to digitize its business while raising its efficiency, sustainability, and ability to innovate. Per the press release, by installing Infor CloudSuite Industrial Enterprise, ASRY will be able to keep track of orders involving thousands of specialist parts and components, both for servicing customers and for running and maintaining its own operations, which is especially important given its substantial scale and scope of business. Additionally, by deploying Infor CloudSuite Industrial Enterprise, ASRY has unified and automated all of its major business functions, including procurement, accounts, and supply chain management. This has improved its efficiency while giving management a single source of truth for all business transactions. The solution allows ASRY to integrate seamlessly with its partners, helping it secure components more quickly, improving repair and fabrication times, and enhancing the customer experience. Moreover, the move to a multi-tenant cloud environment means ASRY benefits from continuous innovation from Infor, with software updates and upgrades taking place automatically as soon as Infor implements them. The solution was implemented by Infor's Consulting Services team.

 

For Full Article, Click Here

In today's business landscape, data is the driving force, and effective data management is thus critical to an organization's core operations. With growing data volumes, businesses need to be prepared with strategies to efficiently collect, store, process, and utilize their data. Marketing strategist Ovais Naseem shares an informative article on Data Science Central on how to master efficient data management. Below, we explore Naseem's best data management strategies to help your organization harness the power of data for informed decision-making, improved customer experiences, and competitive advantage.

Firstly, you must establish clear data objectives. What is your goal for having all this data? Is it customer insights, streamlining operations, product development, or something else? Armed with these well-defined objectives, Naseem says, you can strategically shape your data management initiatives to systematically gather and oversee the most pertinent and invaluable data.

Ensuring data governance and quality. "Data governance involves establishing policies, procedures, and practices for data management. Additionally, it ensures data accuracy, consistency, and security. To maintain high data quality, consider implementing the following strategies in detail: data quality management, data validation and verification, data cleansing, and data classification. Furthermore, categorize data based on importance, sensitivity, and regulatory requirements. This step is crucial for setting access controls and implementing security measures accordingly."

Data security. "Data breaches, with their potential for financial loss and reputational damage, underscore the importance of prioritizing data security. To delve into the intricacies of safeguarding data, consider the following strategies: encryption, access control, and regular audits."

Selecting the right tools for efficient data management. "Selecting the right data management tools is crucial. The choice largely depends on your data's nature and volume. Delve into the details of data storage solutions: relational databases, NoSQL databases, and cloud storage."

Data integration. "In many organizations, data is distributed across various systems and formats. Data integration is the process of combining data from diverse sources into a unified view. Dive into the intricacies of data integration with these strategies: ETL (Extract, Transform, Load) processes, data warehouses, and API integrations."

Data backup and recovery.  “Data loss can be disastrous. Implement a comprehensive backup and recovery strategy, paying attention to the following details: regular backups, redundancy, and a disaster recovery plan.”
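As a concrete illustration of the "regular backups" point, a minimal shell sketch might look like the following. The directories here are temporary stand-ins for real data and backup paths, so the example is illustrative rather than a production backup scheme (which would also cover redundancy and disaster recovery).

```shell
# Stand-ins for a real data directory and backup destination.
SRC=$(mktemp -d)
echo "sample record" > "$SRC/records.txt"
BACKUP_DIR=$(mktemp -d)

# A "regular backup": a dated, compressed archive of the data directory...
STAMP=$(date +%Y%m%d)
tar -czf "$BACKUP_DIR/backup-$STAMP.tar.gz" -C "$SRC" .

# ...followed by a verification that the archive is readable.
tar -tzf "$BACKUP_DIR/backup-$STAMP.tar.gz" >/dev/null && echo "backup verified"
```

In practice a scheduler (such as cron) would run a script like this on a fixed cadence, and copies would be kept on redundant storage.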

Data lifecycle management. “Not all data is equally valuable or relevant. Implementing data lifecycle management helps you prioritize data based on its importance and use. Explore the stages of data lifecycle management in detail: data creation and collection, data storage and access, data archiving, and data deletion.”

Data documentation and metadata. “Comprehensive documentation and metadata management are vital components of data management. Moreover, metadata provides valuable context and information about your data, making it easier to understand and use. It is crucial to pay attention to these details in data documentation:

  • Regarding metadata structure, it is essential to develop a standardized format that includes details such as data source, format, creation date, update history, and usage instructions.
  • Additionally, focusing on data lineage is crucial. It involves tracking the origin and journey of data through various systems and processes, ensuring transparency and accountability"

Data privacy and compliance. “In an era of strict data regulations like GDPR, CCPA, and HIPAA, prioritizing data privacy and compliance is non-negotiable. Delve into the complexities of data privacy with these strategies: data classification, consent management, and compliance audits.”

Data analytics and reporting. “Data management isn’t solely about storing and securing data; instead, it’s also about deriving insights from it. To delve into the detailed aspects of data analytics and reporting, consider the following strategies: data analytics tools, data visualization, and data-driven decision-making.”

Data management, done effectively and efficiently, is the cornerstone of success for businesses today. By implementing Naseem’s strategies, you’ll be well-equipped to make informed decisions, optimize operations, and stay ahead of the competition in the data-driven world.

 

For Full Article, Click Here
