{ "cells": [ { "cell_type": "markdown", "id": "85afd2e5", "metadata": {}, "source": [ "# Standalone example" ] }, { "cell_type": "markdown", "id": "89aaa688", "metadata": {}, "source": [ "## Introduction" ] }, { "cell_type": "markdown", "id": "c9b29270", "metadata": {}, "source": [ "An example that uses the PyGranta Data Flow Extensions package to interact with a resource that isn't part of a\n", "standard Granta MI system." ] }, { "cell_type": "markdown", "id": "0952195f", "metadata": {}, "source": [ "This example script logs the record identifying information received from MI Data Flow. This could be\n", "replaced with any other business logic that makes use of the data provided by MI Data Flow. To perform operations\n", "that rely on additional information from a Granta MI system, see the other examples in this package." ] }, { "cell_type": "markdown", "id": "0a045954", "metadata": {}, "source": [ "### Useful links\n", "* [Recommended script structure](../user_guide/index.rst#recommended-script-structure)\n", "* [Business logic development best practice](../user_guide/index.rst#business-logic-development-best-practice)" ] }, { "cell_type": "markdown", "id": "eabe2044", "metadata": {}, "source": [ "<div class=\"alert alert-warning\">\n", "\n", "**Warning:**\n", "\n", "The `step_logic()` function generates the dataflow payload, and explicitly calls the `get_payload_as_string()` method\n", "with `include_credentials=False` to avoid logging credentials. If you are using Basic or OIDC Authentication and\n", "require these credentials for your business logic, inject these credentials into the\n", "`dataflow_payload[\"AuthorizationHeader\"]` value in the `testing()` function directly, for example via an environment\n", "variable.\n", "</div>" ] }, { "cell_type": "markdown", "id": "ce84628a", "metadata": {}, "source": [ "## Example script" ] }, { "cell_type": "code", "execution_count": null, "id": "e0beccd0", "metadata": {}, "outputs": [], "source": [ "import logging\n", "import traceback\n", "\n", "from ansys.grantami.dataflow_extensions import MIDataflowIntegration\n", "\n", "# Create an instance of the root logger\n", "logger = logging.getLogger()\n", "logger.setLevel(logging.INFO)\n", "\n", "# Add a StreamHandler to write the output to stderr\n", "ch = logging.StreamHandler()\n", "formatter = logging.Formatter(\"%(asctime)s - %(name)s - %(levelname)s - %(message)s\")\n", "ch.setFormatter(formatter)\n", "logger.addHandler(ch)\n", "\n", "\n", "def main():\n", "    \"\"\"\n", "    Initializes the MI Data Flow integration module, runs the business logic,\n", "    and cleans up once execution has completed.\n", "    \"\"\"\n", "\n", "    # Ansys strongly recommends using HTTPS in production environments.\n", "    # If you are using an internal certificate, you should specify the\n", "    # CA certificate with certificate_filename=\"my_cert_file.crt\" and add the\n", "    # certificate to the workflow as a supporting file, or use an absolute\n", "    # pathlib.Path object to the file on disk.\n", "    # Refer to the MIDataflowIntegration API reference page for more details.\n", "    dataflow_integration = MIDataflowIntegration(use_https=False)\n", "\n", "    try:\n", "        step_logic(dataflow_integration)\n", "        exit_code = 0\n", "    except Exception:\n", "        traceback.print_exc()\n", "        exit_code = 1\n", "    dataflow_integration.resume_bookmark(exit_code)\n", "\n", "\n", "def testing():\n", "    \"\"\"Contains a static copy of a Data Flow data payload for testing purposes\"\"\"\n", "\n", "    dataflow_payload = {\n", "        \"WorkflowId\": \"67eb55ff-363a-42c7-9793-df363f1ecc83\",\n", "        \"WorkflowDefinitionId\": \"Example; Version=1.0.0.0\",\n", "        \"TransitionName\": \"Python_83e51914-3752-40d0-8350-c096674873e2\",\n", "        \"Record\": {\n", "            \"Database\": \"MI_Training\",\n", "            \"Table\": \"Metals Pedigree\",\n", "            \"RecordHistoryGuid\": \"d2f51a3d-c274-4a1e-b7c9-8ba2976202cc\",\n", "        },\n", "        \"WorkflowUrl\": \"http://my_server_name/mi_dataflow\",\n", "        \"AuthorizationHeader\": \"\",\n", "        \"ClientCredentialType\": \"Windows\",\n", "        \"Attributes\": {\n", "            \"Record\": {\"Value\": [\"d2f51a3d-c274-4a1e-b7c9-8ba2976202cc+MI_Training\"]},\n", "            \"TransitionId\": {\"Value\": \"9f1bf6e7-0b05-4cd3-ac61-1d2d11a1d351\"},\n", "        },\n", "        \"CustomValues\": {},\n", "    }\n", "\n", "    # Call MIDataflowIntegration constructor with \"dataflow_payload\" argument\n", "    # instead of reading data from Data Flow.\n", "    dataflow_integration = MIDataflowIntegration.from_dict_payload(\n", "        dataflow_payload=dataflow_payload,\n", "        use_https=False,\n", "    )\n", "    step_logic(dataflow_integration)\n", "\n", "\n", "def step_logic(dataflow_integration):\n", "    \"\"\"Contains the business logic to be run as part of the workflow.\n", "\n", "    Replace the code in this module with your custom business logic.\"\"\"\n", "\n", "    # Get the payload from the integration object\n", "    payload = dataflow_integration.get_payload_as_string(\n", "        include_credentials=False,\n", "    )\n", "\n", "    # Log the payload. All log messages will appear in the Data Flow log.\n", "    logger.info(\"Writing dataflow payload.\")\n", "    logger.info(payload)\n", "\n", "\n", "if __name__ == \"__main__\":\n", "    # main()  # Used when running the script as part of a workflow\n", "    testing()  # Used when testing the script manually" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" } }, "nbformat": 4, "nbformat_minor": 5 }