Confidential Machine Learning Inference - Fact Sheet

This explainer fact sheet tells you everything you need to know about our platform, which is built on Intel SGX.

We show you how remote attestation guarantees privacy and security for both the model owner and the data owner.

Finally, you learn how to perform a single confidential inference.

If you are interested in seeing the tool in action, watch the demo video here, or fill in the form and our team will reach out to you.

What others have to say about the confidential ML inference tool:

Luigi Patruno


"Recently I had the opportunity to test out decentriq’s confidential inference capability. The company provides a Python SDK that allows model publishers to upload their models. Clients use the same SDK to upload their data securely and retrieve predictions. Using the SDK was a seamless experience. Installation was quick and easy (think pip install). Then I uploaded a trained TensorFlow model in one process and performed inference in a separate process."
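The two-role workflow described above (a model publisher uploads a model in one process; a client submits data and retrieves predictions in another) can be sketched as follows. This is an illustrative toy only: the fact sheet does not show the real decentriq SDK API, so the class and method names below are hypothetical, and the stub class provides no actual SGX isolation or attestation.

```python
# Illustrative sketch only. The real decentriq SDK API is not shown in this
# fact sheet, so StubEnclave, upload_model, and predict are hypothetical
# names. The stub stands in for the attested SGX enclave: the model owner
# and the data owner each interact with it separately, and neither party
# sees the other's inputs.

class StubEnclave:
    """Toy stand-in for an attested enclave (no real isolation)."""

    def __init__(self):
        self._model = None  # held only inside the "enclave"

    def upload_model(self, weights):
        # Model publisher's step: the model stays inside the enclave boundary.
        self._model = weights

    def predict(self, features):
        # Client's step: data is processed inside; only the result is returned.
        w, b = self._model
        return sum(x * wi for x, wi in zip(features, w)) + b


enclave = StubEnclave()

# Role 1: the model publisher uploads a trained (here: trivial linear) model.
enclave.upload_model(weights=([0.5, -0.25], 1.0))

# Role 2: a client submits data and retrieves a confidential prediction.
prediction = enclave.predict([2.0, 4.0])
print(prediction)  # 2.0*0.5 + 4.0*(-0.25) + 1.0 = 1.0
```

In the real system, the enclave's identity and code would first be verified via remote attestation before either party uploads anything; the stub omits that step entirely.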