
Confidential Machine Learning Inference - Fact Sheet

In this explainer fact sheet, you will find everything you need to know about our platform, built on Intel SGX.


We show you how remote attestation guarantees privacy and security for both the model owner and the data owner.

Finally, you will learn how to perform a single confidential inference.

If you are interested in seeing the tool in action, watch the demo video here, or book your personal demo below.

Book Your Demo

What others have to say about the confidential ML inference tool:

Luigi Patruno
www.mlinproduction.com


"Recently I had the opportunity to test out decentriq’s confidential inference capability. The company provides a Python SDK that allows model publishers to upload their models. Clients use the same client to upload their data securely and retrieve predictions. Using the SDK was a seamless experience. Installation was quick and easy (think pip install). Then I uploaded a trained Tensorflow model in one process and performed inference in a separate process."