Federated learning with influence functions
Federated learning with the "influence function" method from the ICML 2017 best paper. This code is compatible with TensorFlow 2.x and uses eager execution (ported from TensorFlow 1.x).
The pictures below show the most harmful training data for federated learning.
Step 0. Build model
```python
import tensorflow as tf

# Logistic regression on flattened 28x28 inputs
model = tf.keras.models.Sequential()
model.add(tf.keras.layers.Input(shape=(784,)))
model.add(tf.keras.layers.Dense(1, activation='sigmoid', use_bias=True))
model.compile(optimizer='sgd', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(x, y)
```
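Since `x` and `y` are not defined in the snippet, here is a self-contained sketch of Step 0 with synthetic stand-in data (the random inputs are an assumption for illustration only):

```python
import numpy as np
import tensorflow as tf

# Synthetic stand-ins for the training data (x, y) assumed above.
rng = np.random.default_rng(0)
x = rng.normal(size=(64, 784)).astype("float32")
y = (rng.random(64) > 0.5).astype("float32")

# Same model as Step 0: a single sigmoid unit (logistic regression).
model = tf.keras.models.Sequential()
model.add(tf.keras.layers.Input(shape=(784,)))
model.add(tf.keras.layers.Dense(1, activation='sigmoid', use_bias=True))
model.compile(optimizer='sgd', loss='binary_crossentropy', metrics=['accuracy'])
history = model.fit(x, y, epochs=1, verbose=0)
```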
Step 1. Calculate s_test (the inverse-Hessian-vector product). Start from the gradient of the loss at the test point:
```python
TEST_INDEX = 5

x_test_tf = tf.convert_to_tensor(x_test[TEST_INDEX: TEST_INDEX + 1])
y_test_tf = tf.convert_to_tensor(y_test[TEST_INDEX: TEST_INDEX + 1])

# Gradient of the test-point loss w.r.t. the model parameters
test_grad_my = grad_z(x_test_tf, y_test_tf, f=model)
```
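`grad_z` is defined in this repository; as a rough sketch, a per-example gradient helper could look like the following (the name, signature, and use of binary cross-entropy are assumptions; the repo's version also accepts a `for_train` flag, omitted here):

```python
import tensorflow as tf

def grad_z(x, y, f):
    """Sketch: gradient of the binary cross-entropy loss at (x, y)
    with respect to the model f's trainable variables."""
    bce = tf.keras.losses.BinaryCrossentropy()
    with tf.GradientTape() as tape:
        loss = bce(y, f(x))
    # Returns one gradient tensor per trainable variable.
    return tape.gradient(loss, f.trainable_variables)
```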
Step 2. Approximate s_test by applying the inverse Hessian of the training loss to the test gradient:
```python
x_train_tf = tf.convert_to_tensor(x_train)
y_train_tf = tf.convert_to_tensor(y_train)

s_test_my = get_inv_hessian_vector_product(x_train_tf, y_train_tf, test_grad_my, model,
                                           scale=10,
                                           n_recursion=1000,
                                           verbose=False)
```
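The inverse-Hessian-vector product is typically approximated with the LiSSA recursion h_{j+1} = v + h_j − Hv(h_j)/scale, returning h_T/scale. A minimal sketch of what this repo's helper might do, assuming binary cross-entropy and a double `GradientTape` for the Hessian-vector product (names and defaults are assumptions):

```python
import tensorflow as tf

def hvp(x, y, model, vector):
    """Hessian-vector product of the loss at (x, y) with `vector`."""
    bce = tf.keras.losses.BinaryCrossentropy()
    with tf.GradientTape() as outer:
        with tf.GradientTape() as inner:
            loss = bce(y, model(x))
        grads = inner.gradient(loss, model.trainable_variables)
        # Dot product of the gradient with the input vector.
        dot = tf.add_n([tf.reduce_sum(g * v) for g, v in zip(grads, vector)])
    return outer.gradient(dot, model.trainable_variables)

def get_inv_hessian_vector_product(x, y, v, model, scale=10, n_recursion=1000, verbose=False):
    """LiSSA sketch: h_{j+1} = v + h_j - hvp(h_j)/scale; estimate is h_T/scale."""
    h = list(v)
    for j in range(n_recursion):
        hv = hvp(x, y, model, h)
        h = [vi + hi - hvi / scale for vi, hi, hvi in zip(v, h, hv)]
        if verbose and j % 100 == 0:
            print(j, float(tf.add_n([tf.norm(e) for e in h])))
    return [hi / scale for hi in h]
```

The `scale` factor shrinks the Hessian so the recursion converges; dividing by `scale` at the end undoes it.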
Step 3. For each training point, multiply its loss gradient with s_test:
```python
import numpy as np

loss_diff_approx = np.zeros(train_sample_num)

for i in range(train_sample_num):
    # Gradient of the loss at training point i
    train_grad = grad_z(x_train_tf[i: i + 1], y_train_tf[i: i + 1], model, for_train=True)
    loss_diff_approx[i] = multiply_for_influe(train_grad, s_test_my) / train_sample_num
```
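`multiply_for_influe` is presumably a dot product over the per-variable gradient tensors; a sketch under that assumption (the influence function in Koh & Liang's paper carries a minus sign, so sign conventions may differ from this repo):

```python
import tensorflow as tf

def multiply_for_influe(train_grad, s_test):
    """Sketch: dot product of a training point's gradient with s_test,
    summed across every parameter tensor."""
    return float(tf.add_n([tf.reduce_sum(g * s) for g, s in zip(train_grad, s_test)]))
```

Sorting `loss_diff_approx` then ranks training points by their approximate effect on the test loss.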
-------------------------------------------------
Requirements
- TensorFlow 2.x
- NumPy 1.x (we recommend 1.18.5, the version used in our paper)
- scikit-learn 0.23.x (only used in the simulation)
This code was inspired by https://github.com/nayopu/influence_function_with_lissa
and the ICML 2017 best paper, Understanding Black-box Predictions via Influence Functions (https://arxiv.org/pdf/1703.04730).