...
I will use a custom internal application, Deeplinks, Scenario Catalogs, Webhooks, and the Scope GraphQL API to retrieve Scenario Session data as it becomes available and associate that data with work orders in my system-of-record.
To view a live demo of a “custom internal application”, complete with source code, see https://api-demo-ruby.scopear.com. Alternatively, view a video presentation demonstrating the sample application here.
Configuration
For each physical device (aka the “Asset”) that work orders might pertain to in the system-of-record (e.g., “Generator Model #1234”):
The administrator creates at least one Scenario for the Asset using the Scope Create 2 application.
The administrator creates a Scenario Catalog for the Asset using the Scope CMS web interface.
The administrator configures the system-of-record to present a dynamically generated link whenever system-of-record users view work orders (aka the “Deeplink”). The link passes both the asset identifier and work order identifier as parameters:
```
worklink://resource?action=fetch_catalog&asset=INSERT_ASSET_ID_HERE&work_order_id=INSERT_WORK_ORDER_ID_HERE
```
See Using Deeplinks for more details on this, and the sketch at the end of this section for an example of generating such a link.
The administrator creates a custom internal application designed to receive Scope Webhooks. The administrator contacts Scope support to configure the customer’s Scope account to send Webhooks to said application.
A company administrator then creates a webhook that will trigger this application when the pertinent information in the Scope system is added or updated.
See Using WebHooks for more details on how to do this.
Scope Deeplinks are highly extensible and accept any number of external data parameters, not just work_order_id.
For more information, see “Using Webhooks”, “Using Deeplinks”, and “ServiceMax Configuration Guide”.
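For illustration, here is a minimal sketch of how a system-of-record might build the Deeplink described above. It is a Python sketch under stated assumptions: the parameter names (action, asset, work_order_id) come from the example above, while the function name and identifiers are hypothetical.

```python
from urllib.parse import urlencode

def build_worklink_deeplink(asset_id: str, work_order_id: str, **extra_params) -> str:
    """Build a WorkLink Deeplink that passes the asset and work order
    identifiers (plus any extra external data parameters) as query params."""
    params = {"action": "fetch_catalog", "asset": asset_id, "work_order_id": work_order_id}
    params.update(extra_params)  # Deeplinks accept any number of external data parameters
    return "worklink://resource?" + urlencode(params)

# Example: the link the system-of-record would render on a work order page.
print(build_worklink_deeplink("1234", "WO-5678"))
# worklink://resource?action=fetch_catalog&asset=1234&work_order_id=WO-5678
```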
Procedure
Integration Workflow
...
The user visits a Deeplink dynamically generated by the system-of-record (see “Configuration” above).
The WorkLink app launches and loads the Scenario Catalog identified by the Deeplink’s asset query param.
The user selects a Scenario and begins a Scenario Session, causing all additional query parameters passed by the Deeplink to be stored as key-value pairs in a JSON hash on the ScenarioSession.externalData attribute, including, notably, work_order_id.
The customer’s custom application receives a Webhook event notification from the Scope platform indicating that there is new data the application is interested in.
The customer’s custom application queries the Scope GraphQL API to retrieve the ScenarioSession data identified by the Webhook payload’s resource_id (see Session Data Sample Queries).
The customer’s custom application inspects the retrieved data for the value of ScenarioSession.externalData.work_order_id.
The customer’s custom application then associates the retrieved Scenario Session data with the work order in the customer’s system-of-record, using the system-of-record’s available APIs (see the sketch after this list).
Note: the customer’s custom application receives ALL Webhook notifications, not just create and update events pertaining to ScenarioSessions. The application should inspect the Webhook payload’s event and data.resource_type values in order to determine the proper action to take.
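A minimal sketch of such a webhook handler follows. The payload field names (event, data.resource_type, resource_id) follow the description above, but the exact payload shape, GraphQL endpoint URL, authentication scheme, query fields, and the attach_session_to_work_order helper are assumptions to be checked against the Scope API documentation and the system-of-record’s own APIs.

```python
import os
import requests
from flask import Flask, request

app = Flask(__name__)

# Assumed endpoint and auth scheme; confirm against the Scope API documentation.
GRAPHQL_URL = os.environ.get("SCOPE_GRAPHQL_URL", "https://example.scopear.com/graphql")
API_TOKEN = os.environ["SCOPE_API_TOKEN"]

# Hypothetical query shape; the real field names come from the Scope GraphQL schema.
SESSION_QUERY = """
query ($id: ID!) {
  scenarioSession(id: $id) {
    id
    externalData
  }
}
"""

def attach_session_to_work_order(work_order_id, session):
    """Placeholder: call the system-of-record's API to link the session data."""
    print(f"Would attach session {session['id']} to work order {work_order_id}")

@app.route("/scope-webhook", methods=["POST"])
def handle_webhook():
    payload = request.get_json(force=True)

    # The application receives ALL webhook notifications, so filter on the
    # event type and resource type before doing any work.
    if payload.get("event") not in ("create", "update"):
        return "", 204
    if payload.get("data", {}).get("resource_type") != "ScenarioSession":
        return "", 204

    # Retrieve the ScenarioSession identified by the payload's resource_id
    # (exact location of resource_id in the payload is an assumption).
    resource_id = payload.get("data", {}).get("resource_id") or payload.get("resource_id")
    resp = requests.post(
        GRAPHQL_URL,
        json={"query": SESSION_QUERY, "variables": {"id": resource_id}},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    session = resp.json()["data"]["scenarioSession"]

    # Associate the session with the work order in the system-of-record,
    # assuming externalData deserializes to a dict of key-value pairs.
    work_order_id = (session.get("externalData") or {}).get("work_order_id")
    if work_order_id:
        attach_session_to_work_order(work_order_id, session)
    return "", 204
```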
...
I will use a custom internal application and the Scope GraphQL API to retrieve Scenario Session data and store it in a Data Store that my BI tools can read.
Procedure
...
Customer creates a custom application that:
...
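For illustration only, here is a minimal sketch of such an application, based on the user story above. It assumes a hypothetical GraphQL query shape and SQLite as the data store; the real endpoint, authentication, and schema fields would come from the Scope API documentation, and the data store would typically be whatever warehouse the BI tools already read.

```python
import json
import os
import sqlite3
import requests

# Assumed endpoint and auth scheme; confirm against the Scope API documentation.
GRAPHQL_URL = os.environ.get("SCOPE_GRAPHQL_URL", "https://example.scopear.com/graphql")
API_TOKEN = os.environ["SCOPE_API_TOKEN"]

# Hypothetical query; the real field names come from the Scope GraphQL schema.
SESSIONS_QUERY = """
query {
  scenarioSessions {
    id
    externalData
  }
}
"""

def fetch_sessions():
    resp = requests.post(
        GRAPHQL_URL,
        json={"query": SESSIONS_QUERY},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["data"]["scenarioSessions"]

def store_sessions(sessions, db_path="scope_sessions.db"):
    # A flat table that BI tools can query directly.
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS scenario_sessions (id TEXT PRIMARY KEY, external_data TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO scenario_sessions (id, external_data) VALUES (?, ?)",
        [(s["id"], json.dumps(s.get("externalData"))) for s in sessions],
    )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    store_sessions(fetch_sessions())
```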