If your data is used to train a Machine Learning model, chances are that a Data Scientist, a Data Engineer, or an ML Engineer is going to stumble upon it! I know, for example, that Walmart categorically refuses to use AWS because Amazon is a direct competitor, and Walmart doesn't want to risk its data falling into the wrong hands.

One solution to that data privacy problem could be to encrypt the data, but with typical encryption, the data would need to be decrypted before a model could be trained on it. And if the Data Engineers own the key that encrypts the data, what would stop them from decrypting it when they receive new customer data?

One way to go about it is to use Fully Homomorphic Encryption (FHE). FHE means that encryption preserves addition and multiplication: you can add and multiply ciphertexts, and the result decrypts to the sum and product of the underlying plaintexts. For example, if E is the encryption function, we have (conceptually):

E(a + b) = E(a) + E(b) and E(a x b) = E(a) x E(b)

In practice, it means that FHE preserves any polynomial transformation of the data. If a compu...
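To make the homomorphic property concrete, here is a toy sketch in the spirit of the DGHV "somewhat homomorphic" scheme over the integers (van Dijk et al.), which encrypts single bits. The parameter sizes below are hypothetical and far too small to be secure; the point is only to show that adding ciphertexts computes XOR of the plaintext bits and multiplying them computes AND, as long as the accumulated noise stays small relative to the secret key:

```python
import random

# Toy DGHV-style somewhat-homomorphic encryption of single bits.
# WARNING: parameters are illustrative only -- completely insecure.

SECRET_P = 10007  # secret key: an odd integer (small prime here for the demo)

def encrypt(bit, p=SECRET_P):
    """Encrypt a bit as c = bit + 2*r + p*q, where r is small random noise."""
    r = random.randint(1, 5)      # noise: must stay small relative to p
    q = random.randint(10, 100)   # masks the plaintext with a multiple of p
    return bit + 2 * r + p * q

def decrypt(c, p=SECRET_P):
    """Decrypt: reduce mod p to strip p*q, then mod 2 to strip the noise 2*r."""
    return (c % p) % 2

a, b = 1, 0
ca, cb = encrypt(a), encrypt(b)

# Ciphertext addition decrypts to XOR; ciphertext multiplication decrypts
# to AND -- the homomorphic property, valid while the noise is below p.
assert decrypt(ca + cb) == (a ^ b)
assert decrypt(ca * cb) == (a & b)
```

Note that each homomorphic operation grows the noise term (multiplication especially), so this toy scheme only supports a limited number of operations; "fully" homomorphic schemes add a bootstrapping step to refresh noisy ciphertexts and allow unbounded computation.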