In the explainer fact sheet, you can find everything you need to know about our platform, built on Intel SGX.
If you are interested in seeing the tool in action, watch the demo video here, or fill out the form and our team will reach out to you.
Here is what others have to say about the confidential ML inference tool:
Luigi Patruno
www.mlinproduction.com
"Recently I had the opportunity to test out decentriq’s confidential inference capability. The company provides a Python SDK that allows model publishers to upload their models. Clients use the same client to upload their data securely and retrieve predictions. Using the SDK was a seamless experience. Installation was quick and easy (think pip install). Then I uploaded a trained TensorFlow model in one process and performed inference in a separate process."