LLM Guardrails Integration
Swift Security makes it easy to onboard LLM applications so they can be monitored by a System Admin.
Step 1: Log in to Swift Security with valid credentials.
Step 2: Once logged in, navigate to the Integration Section and select the "LLM Guardrails" option.
Step 3: Click on "Add New Integration" and then proceed by clicking "Next".
Step 4: Fill in all the required details to onboard your Chatbot and proceed by clicking "Next".
Application Name - the name of your application.
Application Description - a short description of the application.
Application Type (Internal, External) - whether the application is tagged as internal, external, or both.
Tags - add tags, which are treated as tokens to match within any prompt.
Input Scanners - select this checkbox if scanning is required for prompts submitted by the user.
Output Scanners - select this checkbox if scanning is required for responses returned by the AI model.
Configure -
Track Prompt - records the prompt submitted by the user.
Track Completion - records the response returned by the AI model.
When - “On Alert” tracks the prompt only when an alert occurs; “Always” tracks every prompt regardless of whether an alert is generated.
What - “Full Detail” tracks the entire prompt; “Summary” tracks only part of it, for example the first 100 characters.
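As an illustration, the Step 4 settings above could be captured in a payload like the one below. The field names and values here are assumptions for illustration only, not Swift Security's actual schema:

```python
# Hypothetical onboarding payload mirroring the Step 4 fields.
# All keys and values are illustrative, not the actual Swift Security schema.
integration_config = {
    "application_name": "support-chatbot",
    "application_description": "Customer support assistant",
    "application_type": "Internal",      # Internal, External, or both
    "tags": ["pii", "support"],          # tokens matched within any prompt
    "input_scanners_enabled": True,      # scan prompts submitted by the user
    "output_scanners_enabled": True,     # scan responses from the AI model
    "configure": {
        "track_prompt": True,            # record the user's prompt
        "track_completion": True,        # record the model's response
        "when": "On Alert",              # "On Alert" or "Always"
        "what": "Summary",               # "Full Detail" or "Summary"
    },
}
```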
Step 5: Copy the cURL commands for the Input and Output Scanners by clicking the Copy icon, then click "Finish".
Use the Input cURL in the part of your chatbot API that handles the user's prompt, and the Output cURL in the part that handles the model's response.
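The copied commands typically have a shape like the sketch below. The host, paths, header names, and JSON fields here are placeholders, not the actual values Swift Security generates for your integration:

```shell
# Hypothetical shape of the copied Input and Output Scanner cURL commands.
# The base URL and API key below are placeholders only.
SCAN_BASE_URL="https://guardrails.example.com/api/v1"
API_KEY="YOUR_API_KEY"

# Input Scanner: send the user's prompt for scanning before the model call.
input_scan() {
  curl -s -X POST "$SCAN_BASE_URL/scan/input" \
    -H "Authorization: Bearer $API_KEY" \
    -H "Content-Type: application/json" \
    -d "{\"prompt\": \"$1\"}"
}

# Output Scanner: send the model's response for scanning after the call.
output_scan() {
  curl -s -X POST "$SCAN_BASE_URL/scan/output" \
    -H "Authorization: Bearer $API_KEY" \
    -H "Content-Type: application/json" \
    -d "{\"completion\": \"$1\"}"
}
```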
Step 6: View the details of the onboarded chatbot by clicking the three vertical dots and selecting "View Details".
Step 7: Wire the cURL commands you copied (and tested in Postman earlier) into your chatbot.
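In code, Step 7 amounts to calling the Input Scanner before the model and the Output Scanner after it. The sketch below shows one way a chatbot backend might do this; the endpoint URLs, JSON fields, and the `blocked` response field are assumptions for illustration, not Swift Security's actual API:

```python
import json
from urllib import request

# Placeholder base URL, not the actual endpoint Swift Security generates.
SCAN_BASE_URL = "https://guardrails.example.com/api/v1"

def scan(path: str, payload: dict, send=None) -> dict:
    """POST a payload to a scanner endpoint; `send` can be stubbed in tests."""
    if send is None:
        def send(req):
            with request.urlopen(req) as resp:
                return json.load(resp)
    req = request.Request(
        f"{SCAN_BASE_URL}/{path}",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    return send(req)

def handle_chat(prompt: str, model_call, send=None) -> str:
    # Input Scanner: check the user's prompt before calling the model.
    if scan("scan/input", {"prompt": prompt}, send).get("blocked"):
        return "Your message was blocked by policy."
    completion = model_call(prompt)
    # Output Scanner: check the model's response before returning it.
    if scan("scan/output", {"completion": completion}, send).get("blocked"):
        return "The response was blocked by policy."
    return completion
```

The `send` parameter exists only so the network call can be stubbed out; in production the default path performs the real HTTP request.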