TransNet: Training Privacy-Preserving Neural Network over Transformed Layer

Bibliographic Details
Published in: Proceedings of the VLDB Endowment, July 2020, Vol. 13 (11), pp. 1849-1862
Authors: He, Qijian; Yang, Wei; Chen, Bingren; Geng, Yangyang; Huang, Liusheng
Format: Article
Language: English
Online Access: Full text
Description
Abstract: The accuracy of a neural network can be improved by training over a dataset pooled from multiple participants, but the privacy risk of sharing sensitive data obstructs such collaborative learning. To resolve this tension, we propose TransNet, a novel solution for privacy-preserving collaborative neural networks whose main idea is to add a transformed layer to the neural network. It has lower computation and communication complexity than previous schemes based on secure multi-party computation or homomorphic encryption, and, unlike previous schemes based on differential privacy or stochastic gradient descent, which support only horizontally partitioned datasets, it supports arbitrarily partitioned datasets. TransNet is trained by a server that pools the transformed data, yet it places no special security requirement on that training server. We evaluate TransNet's performance over four datasets using different neural network algorithms. Experimental results demonstrate that TransNet is unaffected by the number of participants and trains as quickly as the original neural network. With properly chosen parameters, TransNet achieves accuracy close to the baseline trained over the pooled original dataset.
ISSN: 2150-8097
DOI: 10.14778/3407790.3407794
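
The abstract describes TransNet's core idea only at a high level: each participant passes its data through an added transformed layer before sharing, so the training server never sees raw records. The sketch below is a rough illustration of that idea only; the record does not specify the actual transformation, so a fixed random linear projection agreed on via a shared seed is assumed here purely for illustration, along with a simple horizontal partition (the paper itself also supports arbitrarily partitioned data).

    import numpy as np

    # Minimal sketch of the transformed-layer idea from the abstract.
    # ASSUMPTION: the concrete transformation is not given in this record;
    # a fixed random linear projection, agreed on by participants through
    # a shared seed, stands in purely for illustration.

    rng = np.random.default_rng(42)          # seed agreed among participants
    d_raw, d_trans = 20, 16                  # raw and transformed feature sizes
    W = rng.normal(size=(d_raw, d_trans))    # shared transformation matrix

    def transform(X: np.ndarray) -> np.ndarray:
        """Apply the agreed transformation before data leaves a participant."""
        return X @ W

    # Two participants with horizontally partitioned local data (illustrative).
    local_a = rng.normal(size=(100, d_raw))
    local_b = rng.normal(size=(150, d_raw))

    # The untrusted training server pools only transformed features and then
    # trains an ordinary neural network on them, as it would on raw data.
    pooled = np.vstack([transform(local_a), transform(local_b)])
    print(pooled.shape)  # (250, 16)

Under this reading, the server runs standard training on features of comparable size to the originals, which is consistent with the abstract's claims that TransNet trains as quickly as the original network and is unaffected by the number of participants.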