...

I will use a custom internal application, Deeplinks, Scenario Catalogs, Webhooks, and the Scope GraphQL API to retrieve Scenario Session data as it becomes available, and associate that data with work orders in my system-of-record.

To view a live demo of a “custom internal application”, complete with source code, see https://api-demo-ruby.scopear.com. Alternatively, view a video presentation demonstrating the sample application here.

Configuration

  1. For each physical device (aka the “Asset”) that work orders might pertain to in the system-of-record (e.g., “Generator Model #1234”):

    1. The administrator creates at least one Scenario for the Asset using the Scope Create 2 application.

    2. The administrator creates a Scenario Catalog for the Asset using the Scope CMS web interface.

  2. The administrator configures the system-of-record to present a dynamically generated link whenever system-of-record users view work orders (aka the “Deeplink”). The link passes both the asset identifier and work order identifier as parameters:

    worklink://resource?action=fetch_catalog&asset=INSERT_ASSET_ID_HERE&work_order_id=INSERT_WORK_ORDER_ID_HERE

    See Using Deeplinks for more details on this.

  3. The administrator creates a custom internal application designed to receive Scope Webhooks. The administrator then contacts Scope support to configure the customer’s Scope account to send Webhooks to that application.

  4. A company administrator then creates a webhook that triggers this application whenever the pertinent information in the Scope system is added or updated.

See Using WebHooks for more details on how to do this.

Scope Deeplinks are highly extensible and accept any number of external data parameters, not just work_order_id.

For more information, see “Using Webhooks”, “Using Deeplinks”, and “ServiceMax Configuration Guide”.

Procedure

Integration Workflow

...

  1. The user visits a Deeplink dynamically generated by the system-of-record (see “Configuration” above).

  2. The WorkLink app launches and loads the Scenario Catalog identified by the Deeplink’s asset query param.

  3. The user selects a Scenario and begins a Scenario Session (causing all additional query parameters passed by the Deeplink, notably work_order_id, to be stored as key-value pairs in a JSON hash on the ScenarioSession.externalData attribute).

  4. The customer’s custom application receives Webhook event notifications from the Scope platform that there is new data that the application is interested in.

  5. The customer’s custom application queries the Scope GraphQL API to retrieve the ScenarioSession data identified by the Webhook payload resource_id (see Session Data Sample Queries ).

  6. The customer’s custom application inspects the retrieved data for the value of ScenarioSession.externalData.work_order_id.

  7. The customer’s custom application then associates the retrieved Scenario Session data with the work order in the customer’s system-of-record, using the system-of-record’s available APIs (or similar).

The customer’s custom application receives ALL Webhook notifications, not just create and update events pertaining to ScenarioSessions. The application should inspect the Webhook payload’s event and data.resource_type values to determine the proper action to take.

...

I will use a custom internal application and the Scope GraphQL API to retrieve Scenario Session data and store it in a data store that my BI tools can read.

Procedure

...

The customer creates a custom application that:

...