Integration Guide
This guide covers common integration patterns for connecting external systems to Contract Lucidity's API.
General Integration Pattern
All integrations follow the same high-level flow:
- Authenticate — Exchange client credentials for a JWT access token.
- Upload — POST the document with `multipart/form-data`.
- Poll — Check `pipeline_status` on the document until it reaches `complete` or `failed`.
- Retrieve — Fetch the report, obligations, or other analysis data.
- Act — Push results to downstream systems (SharePoint, Teams, email, database, etc.).
Polling for Pipeline Status
After uploading a document, poll GET /api/v1/documents/{id} and check the pipeline_status field. The pipeline progresses through these stages in order:
queued → extracting → classifying → playbook → analyzing → embedding → storing → complete
If processing fails, the status will be failed and the response will include error_message with details.
Recommended polling interval: every 5 seconds. A typical document completes in 30-120 seconds depending on size and AI provider response times.
```bash
# Poll until complete or failed
while true; do
  STATUS=$(curl -s https://<your-instance>/api/v1/documents/<doc-id> \
    -H "Authorization: Bearer <access_token>" | jq -r '.pipeline_status')
  echo "Status: $STATUS"
  if [ "$STATUS" = "complete" ] || [ "$STATUS" = "failed" ]; then
    break
  fi
  sleep 5
done
```
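The same loop can be sketched in Python with an overall timeout, matching the 10-minute cap recommended for the Power Automate flow below. The `wait_for_document` helper name is illustrative; it assumes only the `pipeline_status` and document endpoint described above:

```python
import time

import requests


def wait_for_document(base_url: str, doc_id: str, token: str,
                      interval: float = 5.0, timeout: float = 600.0) -> dict:
    """Poll a document until it reaches complete or failed, with an overall timeout."""
    headers = {"Authorization": f"Bearer {token}"}
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        doc = requests.get(f"{base_url}/documents/{doc_id}",
                           headers=headers, timeout=30).json()
        if doc["pipeline_status"] in ("complete", "failed"):
            return doc
        time.sleep(interval)
    raise TimeoutError(f"Document {doc_id} did not finish within {timeout}s")
```

A caller can then branch on the returned document's `pipeline_status` (and read `error_message` on failure) rather than embedding the loop inline.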
Webhook Support
Webhook notifications (push-based instead of polling) are planned but not yet available. For now, use polling as described above.
Power Platform Custom Connector
Microsoft Power Automate and Power Apps can integrate with Contract Lucidity using a custom connector.
Approach: HTTP Actions with Token Management
The simplest approach uses standard HTTP actions rather than a formal custom connector, since Power Automate handles the token exchange natively.
Step 1: Store Credentials
- In Power Automate, create a flow with a trigger (e.g., "When a file is created in SharePoint").
- Add an HTTP action to exchange credentials for a token:
| Setting | Value |
|---|---|
| Method | POST |
| URI | https://<your-instance>/api/v1/auth/token |
| Headers | Content-Type: application/json |
| Body | {"client_id": "<your-client-id>", "client_secret": "<your-client-secret>"} |
- Parse the response JSON to extract `access_token`.
Store the client ID and secret in Azure Key Vault and reference them via the Key Vault connector rather than hardcoding them in the flow.
Step 2: Upload Document
Add another HTTP action:
| Setting | Value |
|---|---|
| Method | POST |
| URI | https://<your-instance>/api/v1/documents |
| Headers | Authorization: Bearer <access_token from Step 1> |
| Body | Form-data with file (from trigger) and project_id |
Step 3: Poll for Completion
Add a Do Until loop:
- HTTP GET `https://<your-instance>/api/v1/documents/<document-id>`.
- Check if `pipeline_status` equals `complete` or `failed`.
- Add a Delay of 10 seconds between iterations.
- Set a timeout (e.g., 10 minutes) to avoid infinite loops.
Step 4: Retrieve and Act
Once complete:
- HTTP GET `/api/v1/documents/{id}/report` to retrieve the analysis.
- Use the SharePoint connector to upload the report or update a list item.
- Use the Teams connector to post a notification.
- Use the Outlook connector to email stakeholders.
Example Flow Summary
SharePoint trigger → HTTP (get token) → HTTP (upload document) → Do Until (poll status) → HTTP (get report) → SharePoint / Teams / Outlook actions
Alternative: Formal Custom Connector
If you prefer a reusable custom connector:
- Export the OpenAPI spec from `https://<your-instance>/api/docs` (Swagger UI) or `https://<your-instance>/openapi.json`.
- In Power Platform, go to Data > Custom connectors > New > Import an OpenAPI file.
- Configure authentication as API Key with the header name `Authorization` and value `Bearer <token>`.
- Test the connector and publish.
The formal custom connector approach requires manually managing token refresh since Power Platform's API Key auth type does not handle token expiry automatically. The HTTP action approach described above gives you full control over token lifecycle.
Postman
Postman is useful for exploring the API during development and testing.
Import the OpenAPI Spec
- Open Postman and click Import.
- Enter the URL: `https://<your-instance>/openapi.json`
- Postman will create a collection with all available endpoints.
Set Up Environment Variables
Create a Postman environment with these variables:
| Variable | Initial Value | Description |
|---|---|---|
| base_url | https://<your-instance>/api/v1 | API base URL |
| client_id | Your client ID | API credential |
| client_secret | Your client secret | API credential |
| access_token | (empty) | Populated automatically |
Auto-Fetch Token with Pre-Request Script
Add this pre-request script to the collection to automatically fetch a token before each request:
```javascript
const tokenUrl = pm.environment.get("base_url") + "/auth/token";
const clientId = pm.environment.get("client_id");
const clientSecret = pm.environment.get("client_secret");

pm.sendRequest({
    url: tokenUrl,
    method: "POST",
    header: { "Content-Type": "application/json" },
    body: {
        mode: "raw",
        raw: JSON.stringify({
            client_id: clientId,
            client_secret: clientSecret
        })
    }
}, function (err, res) {
    if (err) {
        console.error(err);
    } else {
        // Store the token so {{access_token}} resolves in collection auth
        const token = res.json().access_token;
        pm.environment.set("access_token", token);
    }
});
```
Then set the collection's Authorization to Bearer Token with the value {{access_token}}.
Python
A minimal Python integration using the requests library:
```python
import requests
import time

BASE_URL = "https://<your-instance>/api/v1"

# Authenticate
auth = requests.post(f"{BASE_URL}/auth/token", json={
    "client_id": "your-client-id",
    "client_secret": "your-client-secret",
}, timeout=30)
auth.raise_for_status()
token = auth.json()["access_token"]
headers = {"Authorization": f"Bearer {token}"}

# Upload document
with open("contract.pdf", "rb") as f:
    upload = requests.post(
        f"{BASE_URL}/documents",
        headers=headers,
        files={"file": f},
        data={"project_id": "your-project-uuid"},
        timeout=60,
    )
upload.raise_for_status()
doc_id = upload.json()["id"]

# Poll for completion
while True:
    doc = requests.get(f"{BASE_URL}/documents/{doc_id}",
                       headers=headers, timeout=30).json()
    status = doc["pipeline_status"]
    print(f"Status: {status}")
    if status in ("complete", "failed"):
        break
    time.sleep(5)

# Retrieve report
if status == "complete":
    report = requests.get(
        f"{BASE_URL}/documents/{doc_id}/report",
        headers=headers,
        timeout=30,
    ).json()
    print(f"Risk: {report['overall_risk']}")
    print(f"Summary: {report['executive_summary']}")
else:
    print(f"Processing failed: {doc.get('error_message')}")
```
Best Practices
| Area | Recommendation |
|---|---|
| Token caching | Cache the access token and reuse it for its full 24-hour lifetime. Do not request a new token for every API call. |
| Error handling | Always check HTTP status codes. Retry on 500 with exponential backoff. Do not retry on 401 — re-authenticate instead. |
| Timeouts | Set HTTP timeouts on all requests (30 seconds for most endpoints, 60 seconds for document uploads). |
| Bulk uploads | Upload documents sequentially, not in parallel. The worker queue handles concurrency internally. |
| Idempotency | Document uploads are not idempotent. Uploading the same file twice creates two separate documents. Track upload state in your integration to avoid duplicates. |
| Security | Never log or store access tokens in plain text. Use secret managers for client credentials. |
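The token-caching, retry, and re-authentication recommendations above can be sketched as a small client wrapper. This is a minimal sketch, not an official SDK: the `LucidityClient` class and its method names are illustrative, and the 24-hour lifetime and status-code behavior follow the table.

```python
import time

import requests


class LucidityClient:
    """Minimal client sketch: caches the token, re-auths on 401, retries 500s."""

    TOKEN_TTL = 24 * 3600  # 24-hour token lifetime

    def __init__(self, base_url, client_id, client_secret):
        self.base_url = base_url
        self.client_id = client_id
        self.client_secret = client_secret
        self._token = None
        self._token_expiry = 0.0

    def _get_token(self):
        # Reuse the cached token until shortly before it expires.
        if self._token is None or time.monotonic() > self._token_expiry - 60:
            resp = requests.post(f"{self.base_url}/auth/token", json={
                "client_id": self.client_id,
                "client_secret": self.client_secret,
            }, timeout=30)
            resp.raise_for_status()
            self._token = resp.json()["access_token"]
            self._token_expiry = time.monotonic() + self.TOKEN_TTL
        return self._token

    def get(self, path, retries=3):
        for attempt in range(retries + 1):
            resp = requests.get(f"{self.base_url}{path}", timeout=30, headers={
                "Authorization": f"Bearer {self._get_token()}",
            })
            if resp.status_code == 401:
                self._token = None          # re-authenticate instead of retrying blindly
                continue
            if resp.status_code >= 500 and attempt < retries:
                time.sleep(2 ** attempt)    # exponential backoff: 1s, 2s, 4s
                continue
            resp.raise_for_status()
            return resp
        raise RuntimeError(f"GET {path} failed after {retries + 1} attempts")
```

The same pattern extends to POST and upload calls; the key point is that one cached token serves every request rather than each call minting its own.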