Intel MKL WARNING: Support of Intel(R) Advanced Vector Extensions (Intel(R) AVX) enabled only processors has been deprecated. Intel oneAPI Math Kernel Library will use Intel(R) Streaming SIMD Extensions 4.2 (Intel(R) SSE4.2) instructions instead.
Data read successfully
started running:
val_data size =  20
train_data size =  1780
Validation embeddings calculated
Train embeddings calculated
/home/kiranpurohit/miniconda3/envs/llm/lib/python3.9/site-packages/sklearn/cluster/_kmeans.py:1416: FutureWarning: The default value of `n_init` will change from 10 to 'auto' in 1.4. Set the value of `n_init` explicitly to suppress the warning
  super()._check_params_vs_input(X, default_n_init=10)
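The FutureWarning above can be silenced by passing `n_init` explicitly instead of relying on the default. A minimal sketch, assuming scikit-learn >= 1.2 and that the clustering step looks roughly like this (`cluster_embeddings` is a hypothetical helper, not the script's actual function):

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_embeddings(embeddings, n_clusters):
    """Cluster embedding vectors; n_init=10 pins the pre-1.4 default
    so behaviour is unchanged when the default flips to 'auto'."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    return km.fit_predict(np.asarray(embeddings))
```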

predicting: 100%|██████████| 10/10 [05:27<00:00, 32.74s/it]

predicting: 100%|██████████| 5/5 [02:46<00:00, 33.25s/it]

********* LLM LOSS ON U FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.15, 0.4, 0.2, 0.35, 0.35, 0.2, 0.15, 0.3, 0.35, 0.3]]
AVG_LLM_loss_on_VAL_data  [0.275]
MIN_LLM_loss_on_VAL_data  [0.15]
MAX_LLM_loss_on_VAL_data  [0.4]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.25, 0.2, 0.25, 0.3, 0.3]]
AVG_LLM_loss_on_VAL_data  [0.26]
MIN_LLM_loss_on_VAL_data  [0.2]
MAX_LLM_loss_on_VAL_data  [0.3]
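The AVG/MIN/MAX lines above are plain summary statistics over the per-subset validation losses. A sketch (`llm_loss_stats` is a hypothetical helper name, not the script's):

```python
def llm_loss_stats(losses):
    """Average, minimum, and maximum of a list of per-subset LLM losses."""
    return sum(losses) / len(losses), min(losses), max(losses)

# e.g. the V losses printed above:
avg, lo, hi = llm_loss_stats([0.25, 0.2, 0.25, 0.3, 0.3])
# avg ≈ 0.26, lo = 0.2, hi = 0.3
```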

overlaps  [[0, 0, 0, 0, 0, 1, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0]]

AVG_overlap  [0.04444444444444444]
MIN_overlap  [0.0]
MAX_overlap  [0.1111111111111111]
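The per-subset overlap fractions appear to be each subset's 0/1 overlap row averaged over its 9 entries (an inference from the printed values, where every nonzero entry is 1/9 ≈ 0.111). A sketch under that assumption:

```python
def pairwise_overlap(overlaps):
    """Fraction of overlapping pairs per subset, from 0/1 indicator rows,
    plus the average, minimum, and maximum fraction across subsets."""
    fracs = [sum(row) / len(row) for row in overlaps]
    return fracs, sum(fracs) / len(fracs), min(fracs), max(fracs)
```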

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [ 6.13398221e-15 -7.32747196e-15 -1.33226763e-14 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]
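Given the printed shapes, `alpha` (1780,) is consistent with a least-squares solve of `W_V_val` (300 × 1780) against the 300-entry `LLM_loss_on_U_V` vector: the near-machine-epsilon leading entries and the exact-zero tail match a minimum-norm `lstsq` solution, where all-zero columns get a coefficient of exactly 0. This is an inference from the log, not a confirmed detail of the script:

```python
import numpy as np

def solve_alpha(W, losses):
    """Minimum-norm least-squares fit: W @ alpha ≈ losses.
    Columns of W that are entirely zero receive alpha = 0 exactly,
    matching the trailing zeros in the printed vector."""
    alpha, *_ = np.linalg.lstsq(np.asarray(W, dtype=float),
                                np.asarray(losses, dtype=float),
                                rcond=None)
    return alpha
```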

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 3.31713813e-01  2.24711544e-01  1.53573440e-01  9.72027148e-02
  2.77713168e-01  2.70659343e-01  6.79425198e-02  4.07317648e-01
 -1.01728853e-01  1.30165444e-01 -7.04159776e-03  9.27939078e-02
  1.28214019e-01  2.05474696e-01  8.86433456e-02  1.30355604e-01
 -9.60269869e-02  2.56005415e-01  1.74120812e-01  3.57588083e-02
  1.81055304e-01  6.42766480e-01  6.55184490e-01  4.07856012e-01
  2.87600030e-01  3.87540612e-01  4.39652038e-01  2.42606947e-01
 -1.36763252e-03  8.29425058e-01  2.71671833e-01  4.17507821e-01
  3.21340427e-01  4.60873642e-01  4.86862322e-01  5.11514577e-01
  3.00788193e-01  1.90440662e-01  5.43035561e-01  2.55952719e-01
  5.66949239e-02  2.38651171e-01  3.09831214e-01  5.79955385e-02
  9.38507949e-02  1.27713784e-01  2.73068286e-01  1.44484730e-01
  1.27645284e-01  2.52519421e-01  2.74590640e-01  1.58461207e-01
  1.65332281e-01  2.11056741e-01  2.97459280e-01  4.21798257e-01
  3.93033332e-01  6.53708857e-02  1.49585014e-01  1.08818317e-01
  5.91545311e-01  3.38973090e-01  3.12048814e-01  5.26575858e-01
  2.14637225e-01  3.69621732e-01  2.69349650e-01  4.79935533e-01
  1.27737734e-01  8.16182621e-02  2.76865791e-01  3.54660792e-01
  4.59152239e-01  4.39242840e-01  2.98041324e-01  3.78115374e-01
  5.21017924e-02  4.47757515e-01  2.12428232e-01  6.93554982e-01
  5.24690158e-01  3.50233939e-01  3.99940918e-01  1.06693570e-01
 -2.09571006e-02  8.26679478e-02  2.75716371e-01  6.09104634e-01
 -3.02329400e-01  4.31417333e-01  3.50023392e-01  4.75297318e-01
  5.43725592e-01  3.54916695e-01  4.03118951e-01  1.00980837e+00
  4.14751936e-01  1.63207859e-01  5.00754705e-01  1.57470285e-01
 -6.53227678e-03 -1.35558023e-02  1.06520996e-02  1.52322869e-01
  5.16512609e-01  1.13304861e-01  1.57068309e-01  1.07409092e-01
  3.73194785e-01  4.50733767e-02  6.50102390e-01  6.33303149e-02
 -6.15312353e-02  2.53514210e-01  8.22987537e-02  5.21589912e-01
  3.20875739e-01  9.61154070e-03  3.71206334e-01  3.94021453e-01
  1.67998823e-01  6.44556358e-02  1.42515970e-01  3.41822227e-02
  7.32953674e-02  1.80483654e-01  2.16117257e-01  4.97271723e-02
  8.16811992e-02  2.74081387e-01  5.89584494e-02  3.01861363e-01
  1.45132399e-01  1.94541750e-01  4.15457156e-01  2.74847678e-01
  1.84347449e-01 -1.22814177e-01  1.89865497e-01 -2.75550208e-02
  4.62516545e-01  4.72782717e-01  4.27771530e-01 -8.41415873e-02
  3.70853746e-01  5.42368229e-01  3.56742406e-01  5.74722565e-01
 -1.52332433e-02  2.73433163e-01  3.40957686e-01  1.42768882e-01
  4.15936007e-01  2.11006632e-01 -1.43518296e-01  4.61376012e-01
  2.96644493e-01 -1.32441451e-01  7.87396505e-01  1.40111432e-01
  5.79310165e-01  3.55177095e-01  6.59945436e-01  6.64725732e-01
  3.42280875e-01  3.97793105e-01  1.99999188e-01  6.97695999e-01
  5.32744984e-02  9.57836136e-04  2.03733362e-01  4.11952350e-01
  4.60639630e-01  3.29298417e-01 -1.32358545e-02  5.65264025e-01
  2.09305282e-01  1.33694770e-01  4.10629926e-01  2.65472503e-01
  3.35431075e-01  5.46763410e-01  5.29496740e-01  2.35901169e-02
  7.15593564e-01 -1.12298096e-01  1.76047254e-01  5.06818096e-01
 -1.29899579e-01  3.82184846e-01  2.04919644e-01  5.33797976e-01
  4.82195848e-01  2.06584159e-01  8.88180868e-02  6.31979952e-01
  2.71928719e-01 -9.88111826e-02  2.78040085e-01  3.52186196e-01]

approx error on U on val data  [[0.24267877769855098, 0.41533715900924556, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036]]
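The ten per-subset errors above are consistent with splitting the 200 validation points into ten consecutive blocks of 20 and taking the mean absolute gap between the 0/1 LLM loss and its real-valued approximation in each block. A hypothetical reconstruction under that assumption (`chunked_abs_error` is not the script's actual name):

```python
def chunked_abs_error(loss, approx, chunk):
    """Mean |loss - approx| over consecutive blocks of `chunk` points."""
    return [
        sum(abs(l - a) for l, a in zip(loss[i:i + chunk],
                                       approx[i:i + chunk])) / chunk
        for i in range(0, len(loss), chunk)
    ]
```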

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.24909239  0.44041928  0.20682323  0.25918919  0.1174084   0.26437479
  0.0611019   0.26571118  0.31352498  0.23801949  0.23678748  0.21588741
  0.14795906  0.33088089  0.20345093  0.37557346  0.39366814  0.13815208
  0.36665215  0.1634586   0.08437592  0.25809871  0.20132481  0.11979049
  0.24560155  0.2796985   0.20898112  0.15925417  0.19771088  0.32279558
  0.19744102  0.2085494  -0.17090558  0.20138711  0.22533442  0.14817035
  0.34020598  0.00166531  0.39186516  0.1270792   0.3683471   0.19812329
 -0.17666154  0.33975405  0.27252853  0.3864736   0.19214057  0.47665135
  0.33871299  0.17750213  0.18633049  0.20521799 -0.17069586 -0.04504478
  0.29848958  0.45881241  0.21475553  0.29237792  0.18239566  0.66172916
  0.03722865  0.40673307  0.34944459  0.56701245  0.24922604 -0.14683813
  0.24692484  0.40488878  0.44099595  0.39347484  0.09387135  0.11926312
  0.73808576  0.03520865  0.42561265  0.12765816  0.25947391  0.15283659
  0.33428151  0.60562464 -0.0188495   0.1336956   0.55960758  0.23631164
  0.0833085  -0.08360625  0.84479739  0.43194508  0.36784419  0.12506075
 -0.02997447  0.02368892  0.5499502   0.37353321 -0.04279133  0.3484301
  0.46987648  0.3212249   0.50859981  0.71644962]

approx error on V on Val data  [[0.3563420090207584, 0.30552073465556595, 0.33554531900796836, 0.3463328142555741, 0.30197903682795574]]

predicting: 100%|██████████| 1/1 [00:34<00:00, 34.85s/it]

Make new V by taking top v highest loss subsets from L \ U

predicting: 100%|██████████| 5/5 [02:44<00:00, 32.91s/it]

***********************************
S_worst_ind  1

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1]

AVG_LLM_loss_on_VAL_data  [0.275, 0.26499999999999996]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.4, 0.35]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.25, 0.2, 0.25, 0.3, 0.3], [0.2, 0.4, 0.35, 0.2, 0.25]]

AVG_LLM_loss_on_VAL_data  [0.26, 0.28]

MIN_LLM_loss_on_VAL_data  [0.2, 0.2]

MAX_LLM_loss_on_VAL_data  [0.3, 0.4]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1]

approximation 
 [ 3.31713813e-01  2.24711544e-01  1.53573440e-01  9.72027148e-02
  2.77713168e-01  2.70659343e-01  6.79425198e-02  4.07317648e-01
 -1.01728853e-01  1.30165444e-01 -7.04159776e-03  9.27939078e-02
  1.28214019e-01  2.05474696e-01  8.86433456e-02  1.30355604e-01
 -9.60269869e-02  2.56005415e-01  1.74120812e-01  3.57588083e-02
  5.66949239e-02  2.38651171e-01  3.09831214e-01  5.79955385e-02
  9.38507949e-02  1.27713784e-01  2.73068286e-01  1.44484730e-01
  1.27645284e-01  2.52519421e-01  2.74590640e-01  1.58461207e-01
  1.65332281e-01  2.11056741e-01  2.97459280e-01  4.21798257e-01
  3.93033332e-01  6.53708857e-02  1.49585014e-01  1.08818317e-01
  5.91545311e-01  3.38973090e-01  3.12048814e-01  5.26575858e-01
  2.14637225e-01  3.69621732e-01  2.69349650e-01  4.79935533e-01
  1.27737734e-01  8.16182621e-02  2.76865791e-01  3.54660792e-01
  4.59152239e-01  4.39242840e-01  2.98041324e-01  3.78115374e-01
  5.21017924e-02  4.47757515e-01  2.12428232e-01  6.93554982e-01
  5.24690158e-01  3.50233939e-01  3.99940918e-01  1.06693570e-01
 -2.09571006e-02  8.26679478e-02  2.75716371e-01  6.09104634e-01
 -3.02329400e-01  4.31417333e-01  3.50023392e-01  4.75297318e-01
  5.43725592e-01  3.54916695e-01  4.03118951e-01  1.00980837e+00
  4.14751936e-01  1.63207859e-01  5.00754705e-01  1.57470285e-01
 -6.53227678e-03 -1.35558023e-02  1.06520996e-02  1.52322869e-01
  5.16512609e-01  1.13304861e-01  1.57068309e-01  1.07409092e-01
  3.73194785e-01  4.50733767e-02  6.50102390e-01  6.33303149e-02
 -6.15312353e-02  2.53514210e-01  8.22987537e-02  5.21589912e-01
  3.20875739e-01  9.61154070e-03  3.71206334e-01  3.94021453e-01
  1.67998823e-01  6.44556358e-02  1.42515970e-01  3.41822227e-02
  7.32953674e-02  1.80483654e-01  2.16117257e-01  4.97271723e-02
  8.16811992e-02  2.74081387e-01  5.89584494e-02  3.01861363e-01
  1.45132399e-01  1.94541750e-01  4.15457156e-01  2.74847678e-01
  1.84347449e-01 -1.22814177e-01  1.89865497e-01 -2.75550208e-02
  4.62516545e-01  4.72782717e-01  4.27771530e-01 -8.41415873e-02
  3.70853746e-01  5.42368229e-01  3.56742406e-01  5.74722565e-01
 -1.52332433e-02  2.73433163e-01  3.40957686e-01  1.42768882e-01
  4.15936007e-01  2.11006632e-01 -1.43518296e-01  4.61376012e-01
  2.96644493e-01 -1.32441451e-01  7.87396505e-01  1.40111432e-01
  5.79310165e-01  3.55177095e-01  6.59945436e-01  6.64725732e-01
  3.42280875e-01  3.97793105e-01  1.99999188e-01  6.97695999e-01
  5.32744984e-02  9.57836136e-04  2.03733362e-01  4.11952350e-01
  4.60639630e-01  3.29298417e-01 -1.32358545e-02  5.65264025e-01
  2.09305282e-01  1.33694770e-01  4.10629926e-01  2.65472503e-01
  3.35431075e-01  5.46763410e-01  5.29496740e-01  2.35901169e-02
  7.15593564e-01 -1.12298096e-01  1.76047254e-01  5.06818096e-01
 -1.29899579e-01  3.82184846e-01  2.04919644e-01  5.33797976e-01
  4.82195848e-01  2.06584159e-01  8.88180868e-02  6.31979952e-01
  2.71928719e-01 -9.88111826e-02  2.78040085e-01  3.52186196e-01
 -3.52748063e+00 -3.60851356e+00 -3.62407721e+00 -3.29889254e+00
 -3.22766438e+00 -3.21018646e+00 -3.33416909e+00 -3.38390104e+00
 -2.90154736e+00 -3.54329354e+00 -2.95660399e+00 -3.41033031e+00
 -3.68573090e+00 -3.50472145e+00 -3.40841458e+00 -3.19878933e+00
 -3.53454670e+00 -3.17689338e+00 -3.18276696e+00 -2.53148907e+00]

approx error on U for Validation Data after updating U  [[0.24267877769855098, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036, 3.6125006243733644]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 3.72286547e-02  4.06733072e-01  3.49444591e-01  5.67012451e-01
  2.49226037e-01 -1.46838130e-01  2.46924842e-01  4.04888784e-01
  4.40995946e-01  3.93474844e-01  9.38713515e-02  1.19263122e-01
  7.38085764e-01  3.52086534e-02  4.25612647e-01  1.27658163e-01
  2.59473906e-01  1.52836586e-01  3.34281507e-01  6.05624643e-01
 -1.88494979e-02  1.33695597e-01  5.59607575e-01  2.36311638e-01
  8.33085001e-02 -8.36062454e-02  8.44797385e-01  4.31945075e-01
  3.67844188e-01  1.25060752e-01 -2.99744680e-02  2.36889236e-02
  5.49950204e-01  3.73533210e-01 -4.27913326e-02  3.48430098e-01
  4.69876481e-01  3.21224900e-01  5.08599814e-01  7.16449617e-01
  1.81055304e-01  6.42766480e-01  6.55184490e-01  4.07856012e-01
  2.87600030e-01  3.87540612e-01  4.39652038e-01  2.42606947e-01
 -1.36763252e-03  8.29425058e-01  2.71671833e-01  4.17507821e-01
  3.21340427e-01  4.60873642e-01  4.86862322e-01  5.11514577e-01
  3.00788193e-01  1.90440662e-01  5.43035561e-01  2.55952719e-01
  5.90003231e+00  6.02848613e+00  6.32943078e+00  5.71368934e+00
  5.92470969e+00  5.60946500e+00  6.37595636e+00  6.01073741e+00
  5.53839862e+00  5.75724296e+00  5.50371601e+00  5.77752396e+00
  6.06124241e+00  5.91666354e+00  5.85885116e+00  5.34337006e+00
  6.27782082e+00  5.52618922e+00  5.99273012e+00  5.33700818e+00
  1.69393118e+00  1.63373773e+00  1.49255662e+00  1.52335234e+00
  1.42425510e+00  1.58317986e+00  1.48247703e+00  1.58546302e+00
  1.31885502e+00  1.61410071e+00  1.35995021e+00  1.57295629e+00
  1.37833858e+00  1.43659425e+00  1.60929092e+00  1.37433170e+00
  1.60832412e+00  1.36814606e+00  1.45940009e+00  1.18685871e+00]

approx error on V for Validation Data after updating V  [[0.3594888749593668, 0.39127929446089127, 0.44827966485148363, 5.6391632035887636, 1.2353049767603683]]

overlaps  [[0, 0, 0, 0, 1, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 1], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668]
MIN_overlap  [0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111]

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [ 1.18238752e-14 -4.44089210e-15 -4.44089210e-15 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1]

approximation 
 [ 3.31662034e-01  2.24762941e-01  1.53570774e-01  9.71564511e-02
  2.77711232e-01  2.70647455e-01  6.79401583e-02  4.07256621e-01
 -1.01753595e-01  1.30048613e-01 -7.00590053e-03  9.27321722e-02
  1.28296725e-01  2.05512763e-01  8.86499364e-02  1.30381078e-01
 -9.59932979e-02  2.56016986e-01  1.74236395e-01  3.57497806e-02
  5.66949239e-02  2.38651171e-01  3.09831214e-01  5.79955385e-02
  9.38507949e-02  1.27713784e-01  2.73068286e-01  1.44484730e-01
  1.27645284e-01  2.52519421e-01  2.74590640e-01  1.58461207e-01
  1.65332281e-01  2.11056741e-01  2.97459280e-01  4.21798257e-01
  3.93033332e-01  6.53708857e-02  1.49585014e-01  1.08818317e-01
  5.91545311e-01  3.38973090e-01  3.12048814e-01  5.26575858e-01
  2.14637225e-01  3.69621732e-01  2.69349650e-01  4.79935533e-01
  1.27737734e-01  8.16182621e-02  2.76865791e-01  3.54660792e-01
  4.59152239e-01  4.39242840e-01  2.98041324e-01  3.78115374e-01
  5.21017924e-02  4.47757515e-01  2.12428232e-01  6.93554982e-01
  5.24690158e-01  3.50233939e-01  3.99940918e-01  1.06693570e-01
 -2.09571006e-02  8.26679478e-02  2.75716371e-01  6.09104634e-01
 -3.02329400e-01  4.31417333e-01  3.50023392e-01  4.75297318e-01
  5.43725592e-01  3.54916695e-01  4.03118951e-01  1.00980837e+00
  4.14751936e-01  1.63207859e-01  5.00754705e-01  1.57470285e-01
  3.41162511e-02  3.05500447e-02  1.74018936e-01  1.56282691e-01
  3.69271847e-01  1.43807533e-01  2.18325141e-01  1.08592899e-01
  3.12318285e-01  2.18873972e-02  4.07611686e-01  4.27397252e-02
  2.44709262e-01  3.51627792e-01  2.50231883e-02  5.45114994e-01
  2.35352468e-01  1.42001143e-01  2.25808339e-01  2.60665278e-01
  1.38569732e-01  7.91891857e-02  1.68463577e-01  2.92648618e-02
  1.11518649e-01  1.75721238e-01  2.17162020e-01  3.68535304e-02
  8.97585529e-02  2.52476140e-01  5.45928598e-02  3.01605686e-01
  1.54805774e-01  1.60523822e-01  3.97186132e-01  2.77908061e-01
  1.64921020e-01 -9.78485534e-02  1.82934488e-01  5.81438747e-03
  4.62516545e-01  4.72782717e-01  4.27771530e-01 -8.41415873e-02
  3.70853746e-01  5.42368229e-01  3.56742406e-01  5.74722565e-01
 -1.52332433e-02  2.73433163e-01  3.40957686e-01  1.42768882e-01
  4.15936007e-01  2.11006632e-01 -1.43518296e-01  4.61376012e-01
  2.96644493e-01 -1.32441451e-01  7.87396505e-01  1.40111432e-01
  5.79310165e-01  3.55177095e-01  6.59945436e-01  6.64725732e-01
  3.42280875e-01  3.97793105e-01  1.99999188e-01  6.97695999e-01
  5.32744984e-02  9.57836136e-04  2.03733362e-01  4.11952350e-01
  4.60639630e-01  3.29298417e-01 -1.32358545e-02  5.65264025e-01
  2.09305282e-01  1.33694770e-01  4.10629926e-01  2.65472503e-01
  3.35431075e-01  5.46763410e-01  5.29496740e-01  2.35901169e-02
  7.15593564e-01 -1.12298096e-01  1.76047254e-01  5.06818096e-01
 -1.29899579e-01  3.82184846e-01  2.04919644e-01  5.33797976e-01
  4.82195848e-01  2.06584159e-01  8.88180868e-02  6.31979952e-01
  2.71928719e-01 -9.88111826e-02  2.78040085e-01  3.52186196e-01
  2.47727082e-01  2.46337029e-01  1.98342954e-01  2.54128599e-01
  3.79235632e-01  4.22010209e-01  2.91200142e-01  3.52715658e-01
  4.75028449e-01  2.35653660e-01  4.95924690e-01  7.61112392e-02
  1.76115754e-01  3.95324434e-01  4.00264327e-02  3.00307697e-01
  1.50715863e-01  4.53475508e-01  3.19745134e-01  3.38458254e-01]

approx error on U on val data  [[0.24267877769855098, 0.41533715900924556, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.36807271653053697, 0.3341363616556508, 0.39820873737209206]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.01788096  0.06817307 -0.00204272  0.32658549  0.426164    0.0366264
  0.15928416  0.2742785   0.46731835  0.03512956  0.13233615  0.1445458
  0.20967096  0.04176161  0.16846846  0.16925892  0.23687094  0.1543266
  0.18668013  0.59743503  0.26545198  0.31194516  0.59128292  0.42947651
  0.30854591  0.15622641  0.62989501  0.3396906   0.36965115  0.33238667
  0.19693649  0.35556639  0.57147262  0.40478595  0.33311399  0.42675501
  0.49347659  0.39748057  0.42769538  0.56692183  0.21167199  0.34604282
  0.42341412  0.27581717  0.34670084  0.32624518  0.36055807  0.28409184
  0.10169843  0.44337599  0.30236686  0.37822921  0.39090345  0.45273268
  0.42980351  0.47322067  0.39619006  0.25734704  0.38545052  0.23341002
 -0.06919591  0.26957772  0.31167343  0.0698592   0.31527973 -0.03707563
  0.23159227  0.1190233   0.25450053  0.09967396  0.30686698  0.06859913
  0.233072   -0.06059981  0.22177137  0.21506979  0.49731636 -0.03237954
  0.59556293  0.24872247  0.24486684  0.33740243  0.29904185  0.2855686
  0.20520222  0.3490476   0.31612638  0.3458053   0.24013352  0.30681342
  0.03449013  0.13705262  0.10922053  0.08876249  0.24117388  0.31226539
  0.3131631   0.05921712  0.41611602  0.15319395]

approx error on V on Val data  [[0.3563420090207584, 0.30552073465556595, 0.33554531900796836, 0.3463328142555741, 0.30197903682795574], [0.2699766326061984, 0.43664479101758474, 0.441828670924199, 0.2941347128912874, 0.3541301283812711]]

predicting: 100%|██████████| 1/1 [00:30<00:00, 30.98s/it]

Make new V by taking top v highest loss subsets from L \ U

predicting: 100%|██████████| 5/5 [02:42<00:00, 32.53s/it]

***********************************
S_worst_ind  7

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1]

AVG_LLM_loss_on_VAL_data  [0.275, 0.26499999999999996, 0.24499999999999997]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.4, 0.35, 0.35]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.25, 0.2, 0.25, 0.3, 0.3], [0.2, 0.4, 0.35, 0.2, 0.25], [0.35, 0.3, 0.3, 0.25, 0.2]]

AVG_LLM_loss_on_VAL_data  [0.26, 0.28, 0.27999999999999997]

MIN_LLM_loss_on_VAL_data  [0.2, 0.2, 0.2]

MAX_LLM_loss_on_VAL_data  [0.3, 0.4, 0.35]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1]

approximation 
 [ 0.33166203  0.22476294  0.15357077  0.09715645  0.27771123  0.27064746
  0.06794016  0.40725662 -0.1017536   0.13004861 -0.0070059   0.09273217
  0.12829673  0.20551276  0.08864994  0.13038108 -0.0959933   0.25601699
  0.17423639  0.03574978  0.05669492  0.23865117  0.30983121  0.05799554
  0.09385079  0.12771378  0.27306829  0.14448473  0.12764528  0.25251942
  0.27459064  0.15846121  0.16533228  0.21105674  0.29745928  0.42179826
  0.39303333  0.06537089  0.14958501  0.10881832  0.59154531  0.33897309
  0.31204881  0.52657586  0.21463722  0.36962173  0.26934965  0.47993553
  0.12773773  0.08161826  0.27686579  0.35466079  0.45915224  0.43924284
  0.29804132  0.37811537  0.05210179  0.44775751  0.21242823  0.69355498
  0.52469016  0.35023394  0.39994092  0.10669357 -0.0209571   0.08266795
  0.27571637  0.60910463 -0.3023294   0.43141733  0.35002339  0.47529732
  0.54372559  0.3549167   0.40311895  1.00980837  0.41475194  0.16320786
  0.5007547   0.15747028  0.03411625  0.03055004  0.17401894  0.15628269
  0.36927185  0.14380753  0.21832514  0.1085929   0.31231828  0.0218874
  0.40761169  0.04273973  0.24470926  0.35162779  0.02502319  0.54511499
  0.23535247  0.14200114  0.22580834  0.26066528  0.13856973  0.07918919
  0.16846358  0.02926486  0.11151865  0.17572124  0.21716202  0.03685353
  0.08975855  0.25247614  0.05459286  0.30160569  0.15480577  0.16052382
  0.39718613  0.27790806  0.16492102 -0.09784855  0.18293449  0.00581439
  0.46251654  0.47278272  0.42777153 -0.08414159  0.37085375  0.54236823
  0.35674241  0.57472257 -0.01523324  0.27343316  0.34095769  0.14276888
  0.41593601  0.21100663 -0.1435183   0.46137601  0.29664449 -0.13244145
  0.78739651  0.14011143  0.33543107  0.54676341  0.52949674  0.02359012
  0.71559356 -0.1122981   0.17604725  0.5068181  -0.12989958  0.38218485
  0.20491964  0.53379798  0.48219585  0.20658416  0.08881809  0.63197995
  0.27192872 -0.09881118  0.27804009  0.3521862   0.24772708  0.24633703
  0.19834295  0.2541286   0.37923563  0.42201021  0.29120014  0.35271566
  0.47502845  0.23565366  0.49592469  0.07611124  0.17611575  0.39532443
  0.04002643  0.3003077   0.15071586  0.45347551  0.31974513  0.33845825
 -0.42571918 -0.40528079 -0.40329582 -0.39100976 -0.40616303 -0.36284262
 -0.39405625 -0.41930936 -0.3397974  -0.41130068 -0.38297126 -0.39721808
 -0.39029932 -0.40288624 -0.42983108 -0.33476413 -0.40774279 -0.38244107
 -0.36718493 -0.37328542]

approx error on U for Validation Data after updating U  [[0.24267877769855098, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036, 3.6125006243733644], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.3341363616556508, 0.39820873737209206, 0.5413699612501198]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 2.11671993e-01  3.46042822e-01  4.23414122e-01  2.75817174e-01
  3.46700845e-01  3.26245177e-01  3.60558071e-01  2.84091841e-01
  1.01698432e-01  4.43375991e-01  3.02366862e-01  3.78229214e-01
  3.90903453e-01  4.52732683e-01  4.29803505e-01  4.73220669e-01
  3.96190063e-01  2.57347042e-01  3.85450525e-01  2.33410019e-01
  5.79310165e-01  3.55177095e-01  6.59945436e-01  6.64725732e-01
  3.42280875e-01  3.97793105e-01  1.99999188e-01  6.97695999e-01
  5.32744984e-02  9.57836136e-04  2.03733362e-01  4.11952350e-01
  4.60639630e-01  3.29298417e-01 -1.32358545e-02  5.65264025e-01
  2.09305282e-01  1.33694770e-01  4.10629926e-01  2.65472503e-01
  2.65451981e-01  3.11945160e-01  5.91282919e-01  4.29476512e-01
  3.08545913e-01  1.56226407e-01  6.29895011e-01  3.39690602e-01
  3.69651152e-01  3.32386670e-01  1.96936489e-01  3.55566385e-01
  5.71472621e-01  4.04785952e-01  3.33113987e-01  4.26755008e-01
  4.93476586e-01  3.97480573e-01  4.27695381e-01  5.66921830e-01
  5.47734963e-01  5.50872008e-01  5.59891485e-01  4.98506121e-01
  5.39609895e-01  5.00860616e-01  5.15376225e-01  4.86376235e-01
  4.75631353e-01  5.45608893e-01  5.27641992e-01  5.69552407e-01
  5.41829926e-01  5.36993817e-01  5.64935434e-01  5.36935543e-01
  6.05485396e-01  4.65476328e-01  4.86720764e-01  4.15825571e-01
  1.08170720e+00  1.10644112e+00  1.05306328e+00  1.14544446e+00
  1.13040319e+00  1.11013466e+00  1.06853138e+00  1.13034591e+00
  1.08766799e+00  1.07948072e+00  1.18454251e+00  1.02458508e+00
  1.09120668e+00  1.09012226e+00  1.06913169e+00  9.84167024e-01
  1.19258369e+00  1.06580639e+00  1.11525957e+00  1.00636120e+00]

approx error on V for Validation Data after updating V  [[0.3594888749593668, 0.39127929446089127, 0.44827966485148363, 5.6391632035887636, 1.2353049767603683], [0.441828670924199, 0.38406726017293075, 0.40305602082422726, 0.5268244446936265, 0.8924325973686938]]

overlaps  [[0, 0, 0, 0, 1, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0, 0, 0, 0], [0, 1, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 1, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.06666666666666668]
MIN_overlap  [0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111]
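The overlap values logged above are all multiples of 1/9, consistent with nine-exemplar subsets scored by the fraction of shared elements. A minimal sketch under that assumption (the function and argument names are hypothetical, not taken from the script):

```python
def pairwise_overlap(subset, other_subsets):
    """Fraction of exemplars `subset` shares with each subset in `other_subsets`."""
    k = len(subset)  # with k = 9 exemplars per subset, scores step in units of 1/9
    return [len(set(subset) & set(other)) / k for other in other_subsets]
```

AVG/MIN/MAX_overlap would then be the mean, min, and max of these fractions per iteration.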

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [ 5.19029264e-15 -1.11022302e-15 -9.76996262e-15 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]
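Given `W_V_val` of shape (300, 1780) and one target value per row, an `alpha` of shape (1780,) with near-machine-epsilon entries is consistent with an ordinary least-squares fit. The sketch below is an assumption about the solver, not the script's confirmed method (a regularized or non-negative variant is equally plausible):

```python
import numpy as np

def fit_alpha(W, y):
    """Least-squares weights alpha minimizing ||W @ alpha - y||_2 (a sketch)."""
    alpha, *_ = np.linalg.lstsq(W, y, rcond=None)
    return alpha
```

With 300 equations and 1780 unknowns the system is underdetermined, so `lstsq` returns the minimum-norm solution; columns of W that are all zero (training examples never selected) then get exactly zero weight, matching the trailing zeros in the logged alpha.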

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1]

approximation 
 [ 0.33171381  0.22471154  0.15357344  0.09720271  0.27771317  0.27065934
  0.06794252  0.40731765 -0.10172885  0.13016544 -0.0070416   0.09279391
  0.12821402  0.2054747   0.08864335  0.1303556  -0.09602699  0.25600542
  0.17412081  0.03575881  0.05669492  0.23865117  0.30983121  0.05799554
  0.09385079  0.12771378  0.27306829  0.14448473  0.12764528  0.25251942
  0.27459064  0.15846121  0.16533228  0.21105674  0.29745928  0.42179826
  0.39303333  0.06537089  0.14958501  0.10881832  0.59154531  0.33897309
  0.31204881  0.52657586  0.21463722  0.36962173  0.26934965  0.47993553
  0.12773773  0.08161826  0.27686579  0.35466079  0.45915224  0.43924284
  0.29804132  0.37811537  0.05210179  0.44775751  0.21242823  0.69355498
  0.52469016  0.35023394  0.39994092  0.10669357 -0.0209571   0.08266795
  0.27571637  0.60910463 -0.3023294   0.43141733  0.35002339  0.47529732
  0.54372559  0.3549167   0.40311895  1.00980837  0.41475194  0.16320786
  0.5007547   0.15747028  0.03408627  0.03051751  0.17389843  0.15627977
  0.36938046  0.14378503  0.21827995  0.10859203  0.31236319  0.0219045
  0.40779056  0.04275491  0.24448336  0.35155542  0.02506544  0.54509764
  0.23541555  0.14190349  0.22591559  0.26076365  0.16799882  0.06445564
  0.14251597  0.03418222  0.07329537  0.18048365  0.21611726  0.04972717
  0.0816812   0.27408139  0.05895845  0.30186136  0.1451324   0.19454175
  0.41545716  0.27484768  0.18434745 -0.12281418  0.1898655  -0.02755502
  0.46251654  0.47278272  0.42777153 -0.08414159  0.37085375  0.54236823
  0.35674241  0.57472257 -0.01523324  0.27343316  0.34095769  0.14276888
  0.41593601  0.21100663 -0.1435183   0.46137601  0.29664449 -0.13244145
  0.78739651  0.14011143  0.33856645  0.53377967  0.5257301   0.00633802
  0.7176632  -0.10283854  0.18582343  0.48952126 -0.13673097  0.38021901
  0.20193293  0.56234343  0.49659658  0.20842082  0.10989558  0.63946748
  0.2843138  -0.11123212  0.25758051  0.34193699  0.24639161  0.25362344
  0.20326324  0.25905218  0.37114467  0.4198046   0.29349179  0.33949627
  0.47307145  0.24185348  0.48609743  0.07951602  0.1748931   0.3969155
  0.03909352  0.30168987  0.14959739  0.459705    0.32343991  0.33298549
  0.08073337 -0.06864767  0.14462426  0.13812344  0.12602278  0.10481928
  0.01436866 -0.1226296   0.18451102 -0.01757932  0.32846533  0.41358559
  0.33767081  0.21503052  0.45061739  0.24765401  0.02979894  0.07933882
 -0.13415231  0.25978153]

approx error on U on val data  [[0.24267877769855098, 0.41533715900924556, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.36807271653053697, 0.3341363616556508, 0.39820873737209206], [0.24267877769855115, 0.2973521909173338, 0.40451618020483854, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.21167199  0.34604282  0.42341412  0.27581717  0.34670084  0.32624518
  0.36055807  0.28409184  0.10169843  0.44337599  0.30236686  0.37822921
  0.39090345  0.45273268  0.42980351  0.47322067  0.39619006  0.25734704
  0.38545052  0.23341002  0.45091352  0.29326084  0.44682651  0.49306909
  0.30439853  0.33657521  0.21714732  0.55085541  0.20939796 -0.03020107
  0.2937937   0.2666725   0.41034243  0.29331957 -0.03494965  0.43363016
  0.2165304   0.15400047  0.29929825  0.30825073  0.02122265  0.12641169
  0.52569942  0.23795422  0.08781797 -0.12468689  0.81839093  0.48920565
  0.32811495  0.15092454 -0.00961066  0.01137575  0.52963701  0.39309425
 -0.00849638  0.30403488  0.45341643  0.34980991  0.46501962  0.77660208
  0.05399025  0.22726208 -0.08170064  0.19443927  0.3526854   0.21782339
  0.25689777  0.48374695  0.3057977  -0.18052771  0.01515315  0.18727399
  0.35323641  0.3784467   0.22267637  0.17456367  0.23336146  0.58028475
  0.53138907  0.33652914  0.24870454  0.23234677  0.21167504  0.24473895
  0.17725181  0.16216517  0.2398639   0.48677245  0.19364316  0.08586176
  0.32670308 -0.21004627  0.2093012   0.25883237 -0.12697661  0.18853237
  0.19319398  0.25352958  0.25447042  0.20428507]

approx error on V on Val data  [[0.3563420090207584, 0.30552073465556595, 0.33554531900796836, 0.3463328142555741, 0.30197903682795574], [0.2699766326061984, 0.43664479101758474, 0.441828670924199, 0.2941347128912874, 0.3541301283812711], [0.4418286709241979, 0.3832022746629383, 0.30155642826846296, 0.3304427693794233, 0.3120386944680726]]

predicting: 100%|██████████| 1/1 [00:32<00:00, 32.24s/it]

Make new V by taking top v highest loss subsets from L \ U

predicting: 100%|██████████| 5/5 [02:41<00:00, 32.30s/it]

***********************************
S_worst_ind  2

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1]

AVG_LLM_loss_on_VAL_data  [0.275, 0.26499999999999996, 0.24499999999999997, 0.24499999999999997]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.4, 0.35, 0.35, 0.35]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.25, 0.2, 0.25, 0.3, 0.3], [0.2, 0.4, 0.35, 0.2, 0.25], [0.35, 0.3, 0.3, 0.25, 0.2], [0.35, 0.35, 0.2, 0.2, 0.3]]

AVG_LLM_loss_on_VAL_data  [0.26, 0.28, 0.27999999999999997, 0.27999999999999997]

MIN_LLM_loss_on_VAL_data  [0.2, 0.2, 0.2, 0.2]

MAX_LLM_loss_on_VAL_data  [0.3, 0.4, 0.35, 0.35]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 3.31713813e-01  2.24711544e-01  1.53573440e-01  9.72027148e-02
  2.77713168e-01  2.70659343e-01  6.79425198e-02  4.07317648e-01
 -1.01728853e-01  1.30165444e-01 -7.04159776e-03  9.27939078e-02
  1.28214019e-01  2.05474696e-01  8.86433456e-02  1.30355604e-01
 -9.60269869e-02  2.56005415e-01  1.74120812e-01  3.57588083e-02
  5.66949239e-02  2.38651171e-01  3.09831214e-01  5.79955385e-02
  9.38507949e-02  1.27713784e-01  2.73068286e-01  1.44484730e-01
  1.27645284e-01  2.52519421e-01  2.74590640e-01  1.58461207e-01
  1.65332281e-01  2.11056741e-01  2.97459280e-01  4.21798257e-01
  3.93033332e-01  6.53708857e-02  1.49585014e-01  1.08818317e-01
  5.24690158e-01  3.50233939e-01  3.99940918e-01  1.06693570e-01
 -2.09571006e-02  8.26679478e-02  2.75716371e-01  6.09104634e-01
 -3.02329400e-01  4.31417333e-01  3.50023392e-01  4.75297318e-01
  5.43725592e-01  3.54916695e-01  4.03118951e-01  1.00980837e+00
  4.14751936e-01  1.63207859e-01  5.00754705e-01  1.57470285e-01
  3.40862667e-02  3.05175099e-02  1.73898428e-01  1.56279770e-01
  3.69380460e-01  1.43785032e-01  2.18279954e-01  1.08592026e-01
  3.12363190e-01  2.19045003e-02  4.07790559e-01  4.27549139e-02
  2.44483363e-01  3.51555418e-01  2.50654377e-02  5.45097641e-01
  2.35415554e-01  1.41903485e-01  2.25915592e-01  2.60763649e-01
  1.67998823e-01  6.44556358e-02  1.42515970e-01  3.41822227e-02
  7.32953674e-02  1.80483654e-01  2.16117257e-01  4.97271723e-02
  8.16811992e-02  2.74081387e-01  5.89584494e-02  3.01861363e-01
  1.45132399e-01  1.94541750e-01  4.15457156e-01  2.74847678e-01
  1.84347449e-01 -1.22814177e-01  1.89865497e-01 -2.75550208e-02
  4.62516545e-01  4.72782717e-01  4.27771530e-01 -8.41415873e-02
  3.70853746e-01  5.42368229e-01  3.56742406e-01  5.74722565e-01
 -1.52332433e-02  2.73433163e-01  3.40957686e-01  1.42768882e-01
  4.15936007e-01  2.11006632e-01 -1.43518296e-01  4.61376012e-01
  2.96644493e-01 -1.32441451e-01  7.87396505e-01  1.40111432e-01
  3.38566446e-01  5.33779673e-01  5.25730103e-01  6.33801888e-03
  7.17663195e-01 -1.02838543e-01  1.85823426e-01  4.89521264e-01
 -1.36730966e-01  3.80219010e-01  2.01932929e-01  5.62343432e-01
  4.96596575e-01  2.08420816e-01  1.09895577e-01  6.39467480e-01
  2.84313799e-01 -1.11232123e-01  2.57580507e-01  3.41936991e-01
  2.46391606e-01  2.53623442e-01  2.03263237e-01  2.59052182e-01
  3.71144666e-01  4.19804595e-01  2.93491792e-01  3.39496272e-01
  4.73071453e-01  2.41853484e-01  4.86097432e-01  7.95160198e-02
  1.74893105e-01  3.96915497e-01  3.90935152e-02  3.01689871e-01
  1.49597390e-01  4.59704998e-01  3.23439914e-01  3.32985486e-01
  8.07333688e-02 -6.86476712e-02  1.44624263e-01  1.38123443e-01
  1.26022776e-01  1.04819281e-01  1.43686597e-02 -1.22629603e-01
  1.84511018e-01 -1.75793183e-02  3.28465334e-01  4.13585588e-01
  3.37670808e-01  2.15030524e-01  4.50617386e-01  2.47654009e-01
  2.97989442e-02  7.93388190e-02 -1.34152310e-01  2.59781534e-01
 -6.09081382e-15 -6.19801959e-15 -6.31445295e-15 -5.65220984e-15
 -6.57765595e-15 -6.44431421e-15 -6.16526344e-15 -5.84938932e-15
 -6.14704849e-15 -5.92736253e-15 -6.70408299e-15 -6.22346071e-15
 -5.89691902e-15 -6.17691521e-15 -6.23751001e-15 -5.98606105e-15
 -6.74213178e-15 -5.54284367e-15 -6.68437294e-15 -4.72889828e-15]

approx error on U for Validation Data after updating U  [[0.24267877769855098, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036, 3.6125006243733644], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.3341363616556508, 0.39820873737209206, 0.5413699612501198], [0.24267877769855115, 0.2973521909173338, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015, 0.3500000000000061]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [0.21167199 0.34604282 0.42341412 0.27581717 0.34670084 0.32624518
 0.36055807 0.28409184 0.10169843 0.44337599 0.30236686 0.37822921
 0.39090345 0.45273268 0.42980351 0.47322067 0.39619006 0.25734704
 0.38545052 0.23341002 0.59154531 0.33897309 0.31204881 0.52657586
 0.21463722 0.36962173 0.26934965 0.47993553 0.12773773 0.08161826
 0.27686579 0.35466079 0.45915224 0.43924284 0.29804132 0.37811537
 0.05210179 0.44775751 0.21242823 0.69355498 5.49571059 5.61536164
 5.89568293 5.32213747 5.51869686 5.2250555  5.93902018 5.59882922
 5.15885921 5.36270642 5.12655334 5.38159759 5.64587317 5.51120211
 5.45735154 4.9771957  5.84760974 5.1474865  5.58205594 4.97126979
 1.69393118 1.63373773 1.49255662 1.52335234 1.4242551  1.58317986
 1.48247703 1.58546302 1.31885502 1.61410071 1.35995021 1.57295629
 1.37833858 1.43659425 1.60929092 1.3743317  1.60832412 1.36814606
 1.45940009 1.18685871 0.6696319  0.70863362 0.70355424 0.73702444
 0.67617496 0.67456685 0.64715858 0.65408207 0.64547531 0.65244928
 0.67661251 0.6708697  0.70243635 0.67948546 0.73746115 0.60214513
 0.6976633  0.65831993 0.65141877 0.6361835 ]

approx error on V for Validation Data after updating V  [[0.3594888749593668, 0.39127929446089127, 0.44827966485148363, 5.6391632035887636, 1.2353049767603683], [0.441828670924199, 0.38406726017293075, 0.40305602082422726, 0.5268244446936265, 0.8924325973686938], [0.4418286709241979, 0.40451618020483854, 5.2390127728398594, 1.2853049767603693, 0.5816677302860594]]

overlaps  [[0, 0, 0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 1, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.06666666666666668, 0.04444444444444444]
MIN_overlap  [0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111]

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [9.27036226e-15 6.66133815e-15 1.11859730e-01 ... 0.00000000e+00
 0.00000000e+00 0.00000000e+00]
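The alpha vector above is nonnegative and mostly zero, which is consistent with a nonnegative least-squares fit of the validation losses against a per-example weight matrix. The solver actually used is not shown in the log; a minimal self-contained sketch using projected gradient descent (all names and shapes here are illustrative stand-ins):

```python
import numpy as np

def nnls_pgd(W, y, steps=2000):
    """Nonnegative least squares via projected gradient descent.

    Minimises 0.5 * ||W @ alpha - y||^2 subject to alpha >= 0 by
    taking gradient steps and clipping negatives back to zero.
    """
    # Step size 1/L, where L = ||W||_2^2 bounds the gradient's Lipschitz constant.
    lr = 1.0 / np.linalg.norm(W, 2) ** 2
    alpha = np.zeros(W.shape[1])
    for _ in range(steps):
        grad = W.T @ (W @ alpha - y)          # gradient of the 0.5-scaled loss
        alpha = np.maximum(alpha - lr * grad, 0.0)  # project onto alpha >= 0
    return alpha

rng = np.random.default_rng(0)
W = rng.random((20, 1780))   # hypothetical (n_val, n_train) weight matrix
y = rng.random(20)           # hypothetical validation losses to approximate

alpha = nnls_pgd(W, y)
print(alpha.shape)           # (1780,), matching the "alpha shape" line above
```

This is a sketch of the fitting idea only; the run may use a different solver (e.g. an active-set NNLS) and a different weight matrix.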

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 3.31647134e-01  2.24777732e-01  1.53570007e-01  9.71431380e-02
  2.77710675e-01  2.70644035e-01  6.79394788e-02  4.07239060e-01
 -1.01760715e-01  1.30014993e-01 -6.99562811e-03  9.27144068e-02
  1.28320525e-01  2.05523717e-01  8.86518330e-02  1.30388408e-01
 -9.59836034e-02  2.56020315e-01  1.74269655e-01  3.57471827e-02
  5.66949239e-02  2.38651171e-01  3.09831214e-01  5.79955385e-02
  9.38507949e-02  1.27713784e-01  2.73068286e-01  1.44484730e-01
  1.27645284e-01  2.52519421e-01  2.74590640e-01  1.58461207e-01
  1.65332281e-01  2.11056741e-01  2.97459280e-01  4.21798257e-01
  3.93033332e-01  6.53708857e-02  1.49585014e-01  1.08818317e-01
  5.24690158e-01  3.50233939e-01  3.99940918e-01  1.06693570e-01
 -2.09571006e-02  8.26679478e-02  2.75716371e-01  6.09104634e-01
 -3.02329400e-01  4.31417333e-01  3.50023392e-01  4.75297318e-01
  5.43725592e-01  3.54916695e-01  4.03118951e-01  1.00980837e+00
  4.14751936e-01  1.63207859e-01  5.00754705e-01  1.57470285e-01
  3.41162511e-02  3.05500447e-02  1.74018936e-01  1.56282691e-01
  3.69271847e-01  1.43807533e-01  2.18325141e-01  1.08592899e-01
  3.12318285e-01  2.18873972e-02  4.07611686e-01  4.27397252e-02
  2.44709262e-01  3.51627792e-01  2.50231883e-02  5.45114994e-01
  2.35352468e-01  1.42001143e-01  2.25808339e-01  2.60665278e-01
  1.30101074e-01  8.34289838e-02  1.75930387e-01  2.78498181e-02
  1.22517967e-01  1.74350782e-01  2.17462665e-01  3.31489485e-02
  9.20829316e-02  2.46258909e-01  5.33365965e-02  3.01532111e-01
  1.57589432e-01  1.50734657e-01  3.91928374e-01  2.78788732e-01
  1.59330776e-01 -9.06643240e-02  1.80939987e-01  1.54169310e-02
  4.62516545e-01  4.72782717e-01  4.27771530e-01 -8.41415873e-02
  3.70853746e-01  5.42368229e-01  3.56742406e-01  5.74722565e-01
 -1.52332433e-02  2.73433163e-01  3.40957686e-01  1.42768882e-01
  4.15936007e-01  2.11006632e-01 -1.43518296e-01  4.61376012e-01
  2.96644493e-01 -1.32441451e-01  7.87396505e-01  1.40111432e-01
  3.35431075e-01  5.46763410e-01  5.29496740e-01  2.35901169e-02
  7.15593564e-01 -1.12298096e-01  1.76047254e-01  5.06818096e-01
 -1.29899579e-01  3.82184846e-01  2.04919644e-01  5.33797976e-01
  4.82195848e-01  2.06584159e-01  8.88180868e-02  6.31979952e-01
  2.71928719e-01 -9.88111826e-02  2.78040085e-01  3.52186196e-01
  2.47727082e-01  2.46337029e-01  1.98342954e-01  2.54128599e-01
  3.79235632e-01  4.22010209e-01  2.91200142e-01  3.52715658e-01
  4.75028449e-01  2.35653660e-01  4.95924690e-01  7.61112392e-02
  1.76115754e-01  3.95324434e-01  4.00264327e-02  3.00307697e-01
  1.50715863e-01  4.53475508e-01  3.19745134e-01  3.38458254e-01
  1.34858979e-01 -6.62265000e-02  1.27088566e-01  1.27567169e-01
  1.28650945e-01  6.18758289e-02 -1.69186676e-04 -5.92787143e-02
  1.36059777e-01 -7.55422473e-03  3.26297401e-01  4.14001772e-01
  3.18375404e-01  2.32112328e-01  4.76829756e-01  2.09888010e-01
  1.14030226e-02  9.77570783e-02 -1.65522083e-01  3.13401835e-01
  3.85606607e-01  3.30666361e-01  2.98136912e-01  3.37438487e-01
  5.29173532e-01  4.33821986e-01  4.08678194e-01  5.24246205e-01
  3.24397888e-01  3.07576169e-01  3.16840567e-01  3.54489033e-01
  2.51758340e-01  3.17945889e-01  1.53553898e-01  3.89656821e-01
  2.37220320e-01  2.95545296e-01  3.33250736e-01  3.43167799e-01]

approx error on U on val data  [[0.24267877769855098, 0.41533715900924556, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.36807271653053697, 0.3341363616556508, 0.39820873737209206], [0.24267877769855115, 0.2973521909173338, 0.40451618020483854, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.442751308909726]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.21167199  0.34604282  0.42341412  0.27581717  0.34670084  0.32624518
  0.36055807  0.28409184  0.10169843  0.44337599  0.30236686  0.37822921
  0.39090345  0.45273268  0.42980351  0.47322067  0.39619006  0.25734704
  0.38545052  0.23341002  0.59154531  0.33897309  0.31204881  0.52657586
  0.21463722  0.36962173  0.26934965  0.47993553  0.12773773  0.08161826
  0.27686579  0.35466079  0.45915224  0.43924284  0.29804132  0.37811537
  0.05210179  0.44775751  0.21242823  0.69355498 -0.10036081  0.21612988
  0.58250496  0.01291942  0.29319511 -0.04638513  0.30133014 -0.11486829
  0.32533318  0.07345925  0.24943245 -0.02141308  0.41849007  0.05160012
  0.24700001  0.36657436  0.52010566 -0.16839189  0.60256117  0.06049029
  0.25437754  0.16972075  0.15331861  0.28993341  0.19302216  0.30203493
  0.23999518  0.33349633  0.31019751  0.1009276   0.05555707  0.12467912
  0.11642619  0.00953747  0.15660916  0.2423172   0.18074178  0.0451992
  0.27103142  0.21818752  0.41635525  0.27539524  0.21278763  0.39648455
  0.41256718  0.25198773  0.18515323  0.49858339  0.2046155   0.38400181
  0.18992454  0.27173504  0.10365128  0.05671249  0.34923647  0.43258696
  0.2618829   0.27319131  0.18397252  0.48419014]

approx error on V on Val data  [[0.3563420090207584, 0.30552073465556595, 0.33554531900796836, 0.3463328142555741, 0.30197903682795574], [0.2699766326061984, 0.43664479101758474, 0.441828670924199, 0.2941347128912874, 0.3541301283812711], [0.4418286709241979, 0.3832022746629383, 0.30155642826846296, 0.3304427693794233, 0.3120386944680726], [0.44182867092419825, 0.404516180204851, 0.25851996965785634, 0.3027439819624869, 0.39204624029599416]]

predicting: 100%|██████████| 1/1 [00:30<00:00, 30.57s/it]

Make new V by taking top v highest loss subsets from L \ U
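The step logged above, rebuilding V from the v highest-loss subsets in L \ U, can be sketched as follows (the subset indices and loss values are hypothetical, purely for illustration):

```python
def rebuild_V(subset_losses, U, v):
    """Pick the v highest-loss subset indices from L \\ U.

    subset_losses: dict mapping subset index -> validation loss.
    U: set of subset indices already selected.
    v: number of subsets to place in the new V.
    """
    candidates = [(loss, idx) for idx, loss in subset_losses.items()
                  if idx not in U]
    candidates.sort(reverse=True)            # highest loss first
    return [idx for _, idx in candidates[:v]]

# Hypothetical example: 10 candidate subsets, 2 of them already in U.
losses = dict(enumerate([.3, .1, .4, .2, .5, .15, .35, .25, .45, .05]))
print(rebuild_V(losses, U={4, 8}, v=5))      # [2, 6, 0, 7, 3]
```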

predicting: 100%|██████████| 5/5 [02:41<00:00, 32.30s/it]

***********************************
S_worst_ind  9
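`S_worst_ind` marks the subset in U with the highest validation loss, which is then swapped out; the exact loss vector it is computed from is not printed here, so the values below are hypothetical:

```python
def worst_subset_index(losses_per_subset):
    """Index of the subset whose validation loss is largest."""
    return max(range(len(losses_per_subset)),
               key=lambda i: losses_per_subset[i])

# Hypothetical per-subset losses; the real run computes these on updated U.
losses = [0.2, 0.25, 0.3, 0.2, 0.15, 0.35, 0.25, 0.3, 0.2, 0.4]
print(worst_subset_index(losses))  # 9 -- the subset with loss 0.4
```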

********* LLM LOSS ON U FOR VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

AVG_LLM_loss_on_VAL_data  [0.275, 0.26499999999999996, 0.24499999999999997, 0.24499999999999997, 0.22999999999999998]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.4, 0.35, 0.35, 0.35, 0.35]
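The AVG/MIN/MAX series above grow by one entry per refinement round. A minimal sketch of that bookkeeping (the dict layout is hypothetical; the loss list is the one printed in the first LLM-loss block):

```python
def summarize_round(loss_history, round_losses):
    """Append this round's avg/min/max over the per-subset losses."""
    loss_history["avg"].append(sum(round_losses) / len(round_losses))
    loss_history["min"].append(min(round_losses))
    loss_history["max"].append(max(round_losses))
    return loss_history

history = {"avg": [], "min": [], "max": []}
summarize_round(history, [0.15, 0.4, 0.2, 0.35, 0.35,
                          0.2, 0.15, 0.3, 0.35, 0.3])
print(round(history["avg"][0], 3))  # 0.275, cf. the first AVG entry above
```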

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.25, 0.2, 0.25, 0.3, 0.3], [0.2, 0.4, 0.35, 0.2, 0.25], [0.35, 0.3, 0.3, 0.25, 0.2], [0.35, 0.35, 0.2, 0.2, 0.3], [0.3, 0.2, 0.3, 0.35, 0.4]]

AVG_LLM_loss_on_VAL_data  [0.26, 0.28, 0.27999999999999997, 0.27999999999999997, 0.30999999999999994]

MIN_LLM_loss_on_VAL_data  [0.2, 0.2, 0.2, 0.2, 0.2]

MAX_LLM_loss_on_VAL_data  [0.3, 0.4, 0.35, 0.35, 0.4]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 3.31647134e-01  2.24777732e-01  1.53570007e-01  9.71431380e-02
  2.77710675e-01  2.70644035e-01  6.79394788e-02  4.07239060e-01
 -1.01760715e-01  1.30014993e-01 -6.99562811e-03  9.27144068e-02
  1.28320525e-01  2.05523717e-01  8.86518330e-02  1.30388408e-01
 -9.59836034e-02  2.56020315e-01  1.74269655e-01  3.57471827e-02
  5.66949239e-02  2.38651171e-01  3.09831214e-01  5.79955385e-02
  9.38507949e-02  1.27713784e-01  2.73068286e-01  1.44484730e-01
  1.27645284e-01  2.52519421e-01  2.74590640e-01  1.58461207e-01
  1.65332281e-01  2.11056741e-01  2.97459280e-01  4.21798257e-01
  3.93033332e-01  6.53708857e-02  1.49585014e-01  1.08818317e-01
  5.24690158e-01  3.50233939e-01  3.99940918e-01  1.06693570e-01
 -2.09571006e-02  8.26679478e-02  2.75716371e-01  6.09104634e-01
 -3.02329400e-01  4.31417333e-01  3.50023392e-01  4.75297318e-01
  5.43725592e-01  3.54916695e-01  4.03118951e-01  1.00980837e+00
  4.14751936e-01  1.63207859e-01  5.00754705e-01  1.57470285e-01
  3.41162511e-02  3.05500447e-02  1.74018936e-01  1.56282691e-01
  3.69271847e-01  1.43807533e-01  2.18325141e-01  1.08592899e-01
  3.12318285e-01  2.18873972e-02  4.07611686e-01  4.27397252e-02
  2.44709262e-01  3.51627792e-01  2.50231883e-02  5.45114994e-01
  2.35352468e-01  1.42001143e-01  2.25808339e-01  2.60665278e-01
  1.30101074e-01  8.34289838e-02  1.75930387e-01  2.78498181e-02
  1.22517967e-01  1.74350782e-01  2.17462665e-01  3.31489485e-02
  9.20829316e-02  2.46258909e-01  5.33365965e-02  3.01532111e-01
  1.57589432e-01  1.50734657e-01  3.91928374e-01  2.78788732e-01
  1.59330776e-01 -9.06643240e-02  1.80939987e-01  1.54169310e-02
  4.62516545e-01  4.72782717e-01  4.27771530e-01 -8.41415873e-02
  3.70853746e-01  5.42368229e-01  3.56742406e-01  5.74722565e-01
 -1.52332433e-02  2.73433163e-01  3.40957686e-01  1.42768882e-01
  4.15936007e-01  2.11006632e-01 -1.43518296e-01  4.61376012e-01
  2.96644493e-01 -1.32441451e-01  7.87396505e-01  1.40111432e-01
  3.35431075e-01  5.46763410e-01  5.29496740e-01  2.35901169e-02
  7.15593564e-01 -1.12298096e-01  1.76047254e-01  5.06818096e-01
 -1.29899579e-01  3.82184846e-01  2.04919644e-01  5.33797976e-01
  4.82195848e-01  2.06584159e-01  8.88180868e-02  6.31979952e-01
  2.71928719e-01 -9.88111826e-02  2.78040085e-01  3.52186196e-01
  2.47727082e-01  2.46337029e-01  1.98342954e-01  2.54128599e-01
  3.79235632e-01  4.22010209e-01  2.91200142e-01  3.52715658e-01
  4.75028449e-01  2.35653660e-01  4.95924690e-01  7.61112392e-02
  1.76115754e-01  3.95324434e-01  4.00264327e-02  3.00307697e-01
  1.50715863e-01  4.53475508e-01  3.19745134e-01  3.38458254e-01
  1.34858979e-01 -6.62265000e-02  1.27088566e-01  1.27567169e-01
  1.28650945e-01  6.18758289e-02 -1.69186676e-04 -5.92787143e-02
  1.36059777e-01 -7.55422473e-03  3.26297401e-01  4.14001772e-01
  3.18375404e-01  2.32112328e-01  4.76829756e-01  2.09888010e-01
  1.14030226e-02  9.77570783e-02 -1.65522083e-01  3.13401835e-01
 -5.60065807e-15 -5.03154704e-15 -4.94622377e-15 -5.45265075e-15
 -4.95693340e-15 -4.91802511e-15 -4.91234543e-15 -5.56039731e-15
 -4.64727726e-15 -4.81095494e-15 -4.87604048e-15 -4.92733593e-15
 -5.14040321e-15 -4.80145434e-15 -4.98260008e-15 -4.66017384e-15
 -4.92830254e-15 -5.32126755e-15 -4.72959271e-15 -4.48844593e-15]

approx error on U for Validation Data after updating U  [[0.24267877769855098, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036, 3.6125006243733644], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.3341363616556508, 0.39820873737209206, 0.5413699612501198], [0.24267877769855115, 0.2973521909173338, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015, 0.3500000000000061], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.20000000000000498]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [0.38560661 0.33066636 0.29813691 0.33743849 0.52917353 0.43382199
 0.40867819 0.5242462  0.32439789 0.30757617 0.31684057 0.35448903
 0.25175834 0.31794589 0.1535539  0.38965682 0.23722032 0.2955453
 0.33325074 0.3431678  1.0817072  1.10644112 1.05306328 1.14544446
 1.13040319 1.11013466 1.06853138 1.13034591 1.08766799 1.07948072
 1.18454251 1.02458508 1.09120668 1.09012226 1.06913169 0.98416702
 1.19258369 1.06580639 1.11525957 1.0063612  0.56294189 0.4851484
 0.47730566 0.56250655 0.50283866 0.48712107 0.47319824 0.52595259
 0.47875213 0.4802833  0.47767753 0.5092049  0.50501044 0.46318532
 0.50031211 0.42476161 0.49055634 0.54460899 0.43732728 0.49497958
 0.59154531 0.33897309 0.31204881 0.52657586 0.21463722 0.36962173
 0.26934965 0.47993553 0.12773773 0.08161826 0.27686579 0.35466079
 0.45915224 0.43924284 0.29804132 0.37811537 0.05210179 0.44775751
 0.21242823 0.69355498 4.48168481 4.53949278 4.73636675 4.31167426
 4.47186834 4.20486718 4.75484581 4.54650383 4.12690059 4.36771253
 4.16169824 4.36230811 4.54115644 4.46214728 4.45988527 3.990856
 4.70810128 4.17588862 4.46479065 4.03820279]

approx error on V for Validation Data after updating V  [[0.3594888749593668, 0.39127929446089127, 0.44827966485148363, 5.6391632035887636, 1.2353049767603683], [0.441828670924199, 0.38406726017293075, 0.40305602082422726, 0.5268244446936265, 0.8924325973686938], [0.4418286709241979, 0.40451618020483854, 5.2390127728398594, 1.2853049767603693, 0.5816677302860594], [0.4235089257700596, 0.8924325973687317, 0.5079158217231468, 0.404516180204851, 3.995347576959955]]

overlaps  [[0, 0, 0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 1, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111]
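The overlap values of 0.1111… above equal 1/9, i.e. one shared example between two subsets of size 9. A plausible sketch of this pairwise-overlap computation (subset contents below are hypothetical):

```python
def pairwise_overlap(subsets):
    """Fraction of shared examples between each pair of equal-size subsets."""
    k = len(subsets[0])
    out = []
    for i, a in enumerate(subsets):
        row = [len(set(a) & set(b)) / k
               for j, b in enumerate(subsets) if j != i]
        out.append(row)
    return out

# Two size-9 subsets sharing exactly one element -> overlap 1/9 ~ 0.1111.
s1 = list(range(9))          # {0..8}
s2 = list(range(8, 17))      # {8..16}, shares element 8 with s1
print(pairwise_overlap([s1, s2]))
```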

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [-4.77395901e-15 -6.66133815e-16  1.34912949e+00 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.33171381  0.22471154  0.15357344  0.09720271  0.27771317  0.27065934
  0.06794252  0.40731765 -0.10172885  0.13016544 -0.0070416   0.09279391
  0.12821402  0.2054747   0.08864335  0.1303556  -0.09602699  0.25600542
  0.17412081  0.03575881  0.05669492  0.23865117  0.30983121  0.05799554
  0.09385079  0.12771378  0.27306829  0.14448473  0.12764528  0.25251942
  0.27459064  0.15846121  0.16533228  0.21105674  0.29745928  0.42179826
  0.39303333  0.06537089  0.14958501  0.10881832  0.52469016  0.35023394
  0.39994092  0.10669357 -0.0209571   0.08266795  0.27571637  0.60910463
 -0.3023294   0.43141733  0.35002339  0.47529732  0.54372559  0.3549167
  0.40311895  1.00980837  0.41475194  0.16320786  0.5007547   0.15747028
  0.03408627  0.03051751  0.17389843  0.15627977  0.36938046  0.14378503
  0.21827995  0.10859203  0.31236319  0.0219045   0.40779056  0.04275491
  0.24448336  0.35155542  0.02506544  0.54509764  0.23541555  0.14190349
  0.22591559  0.26076365  0.16799882  0.06445564  0.14251597  0.03418222
  0.07329537  0.18048365  0.21611726  0.04972717  0.0816812   0.27408139
  0.05895845  0.30186136  0.1451324   0.19454175  0.41545716  0.27484768
  0.18434745 -0.12281418  0.1898655  -0.02755502  0.46251654  0.47278272
  0.42777153 -0.08414159  0.37085375  0.54236823  0.35674241  0.57472257
 -0.01523324  0.27343316  0.34095769  0.14276888  0.41593601  0.21100663
 -0.1435183   0.46137601  0.29664449 -0.13244145  0.78739651  0.14011143
  0.33543107  0.54676341  0.52949674  0.02359012  0.71559356 -0.1122981
  0.17604725  0.5068181  -0.12989958  0.38218485  0.20491964  0.53379798
  0.48219585  0.20658416  0.08881809  0.63197995  0.27192872 -0.09881118
  0.27804009  0.3521862   0.24639161  0.25362344  0.20326324  0.25905218
  0.37114467  0.4198046   0.29349179  0.33949627  0.47307145  0.24185348
  0.48609743  0.07951602  0.1748931   0.3969155   0.03909352  0.30168987
  0.14959739  0.459705    0.32343991  0.33298549  0.07329588 -0.06898037
  0.14703387  0.139574    0.12566164  0.11072021  0.01636633 -0.13133475
  0.19116878 -0.01895688  0.32876323  0.4135284   0.34032222  0.21268329
  0.4470155   0.25284349  0.03232676  0.07680794 -0.12984174  0.25241348
  0.28411536 -0.01280278  0.06336972  0.07160814  0.10047051  0.3871082
  0.32507695  0.51207936  0.20267353  0.02332904  0.10722372  0.07442154
  0.14077513  0.15308255  0.28279133  0.31112127  0.05339112  0.21368885
  0.29889329  0.38866572]

approx error on U on val data  [[0.24267877769855098, 0.41533715900924556, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.36807271653053697, 0.3341363616556508, 0.39820873737209206], [0.24267877769855115, 0.2973521909173338, 0.40451618020483854, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.442751308909726], [0.242678777698551, 0.29735219091733733, 0.3520643368808044, 0.27375205578352574, 0.23611652516689285, 0.34017933632769454, 0.33413636165565125, 0.3986182455403873, 0.23360339995603016, 0.28178804220163]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.3241963   0.09005276  0.20733371  0.25769999  0.50534853  0.49892952
  0.34936844  0.40058473  0.43353798  0.0489613   0.49166466  0.29936054
  0.22514103  0.25812234  0.00168728  0.39245342  0.15956208  0.22441232
  0.3034307   0.42680075  0.24870454  0.23234677  0.21167504  0.24473895
  0.17725181  0.16216517  0.2398639   0.48677245  0.19364316  0.08586176
  0.32670308 -0.21004627  0.2093012   0.25883237 -0.12697661  0.18853237
  0.19319398  0.25352958  0.25447042  0.20428507  0.16895336  0.67656047
  0.34592599  0.24880647  0.12752867  0.53156228  0.40034366  0.57065323
 -0.12581774  0.69065904  0.03489294  0.31382005 -0.37228273  0.43545617
  0.41771231  0.532141    0.38337403 -0.16749687  0.59610165  0.1362417
  0.59154531  0.33897309  0.31204881  0.52657586  0.21463722  0.36962173
  0.26934965  0.47993553  0.12773773  0.08161826  0.27686579  0.35466079
  0.45915224  0.43924284  0.29804132  0.37811537  0.05210179  0.44775751
  0.21242823  0.69355498  0.14237339  0.2611567   0.64865692  0.39068503
  0.21528587  0.01590347  0.84377225  0.49059881  0.33431453  0.35360563
  0.05148182  0.22235847  0.61699659  0.47456771  0.20740445  0.41705639
  0.55344129  0.40932349  0.51536895  0.76812266]

approx error on V on Val data  [[0.3563420090207584, 0.30552073465556595, 0.33554531900796836, 0.3463328142555741, 0.30197903682795574], [0.2699766326061984, 0.43664479101758474, 0.441828670924199, 0.2941347128912874, 0.3541301283812711], [0.4418286709241979, 0.3832022746629383, 0.30155642826846296, 0.3304427693794233, 0.3120386944680726], [0.44182867092419825, 0.404516180204851, 0.25851996965785634, 0.3027439819624869, 0.39204624029599416], [0.37967892205407766, 0.31203869446807575, 0.3258229735468181, 0.4045161802048402, 0.38575951518888096]]

predicting: 100%|██████████| 1/1 [00:29<00:00, 29.78s/it]

Make new V by taking top v highest loss subsets from L \ U

predicting: 100%|██████████| 5/5 [02:43<00:00, 32.71s/it]

***********************************
S_worst_ind  2

********* LLM LOSS ON U FOR VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1]

AVG_LLM_loss_on_VAL_data  [0.275, 0.26499999999999996, 0.24499999999999997, 0.24499999999999997, 0.22999999999999998, 0.22999999999999998]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.4, 0.35, 0.35, 0.35, 0.35, 0.35]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.25, 0.2, 0.25, 0.3, 0.3], [0.2, 0.4, 0.35, 0.2, 0.25], [0.35, 0.3, 0.3, 0.25, 0.2], [0.35, 0.35, 0.2, 0.2, 0.3], [0.3, 0.2, 0.3, 0.35, 0.4], [0.35, 0.25, 0.35, 0.25, 0.3]]

AVG_LLM_loss_on_VAL_data  [0.26, 0.28, 0.27999999999999997, 0.27999999999999997, 0.30999999999999994, 0.3]

MIN_LLM_loss_on_VAL_data  [0.2, 0.2, 0.2, 0.2, 0.2, 0.25]

MAX_LLM_loss_on_VAL_data  [0.3, 0.4, 0.35, 0.35, 0.4, 0.35]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.33171381  0.22471154  0.15357344  0.09720271  0.27771317  0.27065934
  0.06794252  0.40731765 -0.10172885  0.13016544 -0.0070416   0.09279391
  0.12821402  0.2054747   0.08864335  0.1303556  -0.09602699  0.25600542
  0.17412081  0.03575881  0.05669492  0.23865117  0.30983121  0.05799554
  0.09385079  0.12771378  0.27306829  0.14448473  0.12764528  0.25251942
  0.27459064  0.15846121  0.16533228  0.21105674  0.29745928  0.42179826
  0.39303333  0.06537089  0.14958501  0.10881832  0.03408627  0.03051751
  0.17389843  0.15627977  0.36938046  0.14378503  0.21827995  0.10859203
  0.31236319  0.0219045   0.40779056  0.04275491  0.24448336  0.35155542
  0.02506544  0.54509764  0.23541555  0.14190349  0.22591559  0.26076365
  0.16799882  0.06445564  0.14251597  0.03418222  0.07329537  0.18048365
  0.21611726  0.04972717  0.0816812   0.27408139  0.05895845  0.30186136
  0.1451324   0.19454175  0.41545716  0.27484768  0.18434745 -0.12281418
  0.1898655  -0.02755502  0.46251654  0.47278272  0.42777153 -0.08414159
  0.37085375  0.54236823  0.35674241  0.57472257 -0.01523324  0.27343316
  0.34095769  0.14276888  0.41593601  0.21100663 -0.1435183   0.46137601
  0.29664449 -0.13244145  0.78739651  0.14011143  0.33543107  0.54676341
  0.52949674  0.02359012  0.71559356 -0.1122981   0.17604725  0.5068181
 -0.12989958  0.38218485  0.20491964  0.53379798  0.48219585  0.20658416
  0.08881809  0.63197995  0.27192872 -0.09881118  0.27804009  0.3521862
  0.24639161  0.25362344  0.20326324  0.25905218  0.37114467  0.4198046
  0.29349179  0.33949627  0.47307145  0.24185348  0.48609743  0.07951602
  0.1748931   0.3969155   0.03909352  0.30168987  0.14959739  0.459705
  0.32343991  0.33298549  0.07329588 -0.06898037  0.14703387  0.139574
  0.12566164  0.11072021  0.01636633 -0.13133475  0.19116878 -0.01895688
  0.32876323  0.4135284   0.34032222  0.21268329  0.4470155   0.25284349
  0.03232676  0.07680794 -0.12984174  0.25241348  0.28411536 -0.01280278
  0.06336972  0.07160814  0.10047051  0.3871082   0.32507695  0.51207936
  0.20267353  0.02332904  0.10722372  0.07442154  0.14077513  0.15308255
  0.28279133  0.31112127  0.05339112  0.21368885  0.29889329  0.38866572
 -0.14947527  0.04507341 -0.0704259  -0.31862856 -0.22233764 -0.01194413
 -0.06705535 -0.13681846 -0.24886411  0.04136764 -0.20425003 -0.09448331
 -0.27536584 -0.08014987 -0.02902839 -0.01659034  0.001951   -0.41522755
  0.02733056 -0.43399873]

approx error on U for Validation Data after updating U  [[0.24267877769855098, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036, 3.6125006243733644], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.3341363616556508, 0.39820873737209206, 0.5413699612501198], [0.24267877769855115, 0.2973521909173338, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015, 0.3500000000000061], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.20000000000000498], [0.242678777698551, 0.29735219091733733, 0.27375205578352574, 0.23611652516689285, 0.34017933632769454, 0.33413636165565125, 0.3986182455403873, 0.23360339995603016, 0.28178804220163, 0.48727790838879315]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.52469016  0.35023394  0.39994092  0.10669357 -0.0209571   0.08266795
  0.27571637  0.60910463 -0.3023294   0.43141733  0.35002339  0.47529732
  0.54372559  0.3549167   0.40311895  1.00980837  0.41475194  0.16320786
  0.5007547   0.15747028  0.54773496  0.55087201  0.55989148  0.49850612
  0.53960989  0.50086062  0.51537623  0.48637623  0.47563135  0.54560889
  0.52764199  0.56955241  0.54182993  0.53699382  0.56493543  0.53693554
  0.6054854   0.46547633  0.48672076  0.41582557  0.59154531  0.33897309
  0.31204881  0.52657586  0.21463722  0.36962173  0.26934965  0.47993553
  0.12773773  0.08161826  0.27686579  0.35466079  0.45915224  0.43924284
  0.29804132  0.37811537  0.05210179  0.44775751  0.21242823  0.69355498
  0.14237339  0.2611567   0.64865692  0.39068503  0.21528587  0.01590347
  0.84377225  0.49059881  0.33431453  0.35360563  0.05148182  0.22235847
  0.61699659  0.47456771  0.20740445  0.41705639  0.55344129  0.40932349
  0.51536895  0.76812266  4.80908174  4.9137837   5.15908192  4.65719468
  4.82919613  4.57224204  5.19700465  4.89931683  4.51431625  4.69269498
  4.48604665  4.7092259   4.94048315  4.82263776  4.77551523  4.35534961
  5.11701495  4.50436444  4.88463918  4.35016408]

approx error on V for Validation Data after updating V  [[0.3594888749593668, 0.39127929446089127, 0.44827966485148363, 5.6391632035887636, 1.2353049767603683], [0.441828670924199, 0.38406726017293075, 0.40305602082422726, 0.5268244446936265, 0.8924325973686938], [0.4418286709241979, 0.40451618020483854, 5.2390127728398594, 1.2853049767603693, 0.5816677302860594], [0.4235089257700596, 0.8924325973687317, 0.5079158217231468, 0.404516180204851, 3.995347576959955], [0.3537308765669242, 0.51241785331602, 0.4045161802048402, 0.365809380659885, 4.459467692177759]]

overlaps  [[0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 1, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 1], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 1, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444, 0.06666666666666668]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.2222222222222222]
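The overlap values above are all multiples of 1/9, consistent with measuring, for each pair of 9-example prompt subsets, the fraction of shared examples. A minimal sketch of that computation (function and variable names are hypothetical, not from the run's source):

```python
# Hypothetical sketch: overlap between two prompt subsets as the fraction
# of shared examples, matching the 1/9 granularity of overlap_for_subset.
def subset_overlap(a, b):
    """Fraction of examples in subset `a` that also appear in subset `b`."""
    a, b = set(a), set(b)
    return len(a & b) / len(a)

s1 = [3, 7, 12, 20, 41, 55, 60, 71, 90]
s2 = [7, 8, 13, 22, 41, 56, 61, 72, 91]
print(subset_overlap(s1, s2))  # 2 shared of 9 -> 0.2222...
```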

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]
Intel MKL WARNING: Support of Intel(R) Advanced Vector Extensions (Intel(R) AVX) enabled only processors has been deprecated. Intel oneAPI Math Kernel Library will use Intel(R) Streaming SIMD Extensions 4.2 (Intel(R) SSE4.2) instructions instead.

alpha shape  (1780,)

alpha  [ 1.55431223e-15 -6.88338275e-15  5.32907052e-15 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]
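`alpha` has one entry per training example (1780) and near-zero values, consistent with a least-squares solve against the (300, 1780) matrix `W_V_val` printed above. One possible construction, sketched at reduced scale with standard NumPy (the shapes mirror the log; the solver choice is an assumption, not confirmed by the source):

```python
import numpy as np

rng = np.random.default_rng(0)
n_rows, n_train = 30, 178                  # scaled-down stand-ins for (300, 1780)
W = rng.standard_normal((n_rows, n_train))  # plays the role of W_V_val
y = rng.standard_normal(n_rows)             # per-row validation losses (toy values)

# Minimum-norm least-squares solution; in the underdetermined case most
# coordinates come out tiny, matching the ~1e-15 alpha entries logged above.
alpha, *_ = np.linalg.lstsq(W, y, rcond=None)
print(alpha.shape)  # (178,)
```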

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.33168775  0.22473742  0.1535721   0.09717943  0.27771219  0.27065336
  0.06794133  0.40728693 -0.10174131  0.13010663 -0.00702363  0.09276283
  0.12825565  0.20549386  0.08864666  0.13036843 -0.09601003  0.25601124
  0.174179    0.03575426  0.05669492  0.23865117  0.30983121  0.05799554
  0.09385079  0.12771378  0.27306829  0.14448473  0.12764528  0.25251942
  0.27459064  0.15846121  0.16533228  0.21105674  0.29745928  0.42179826
  0.39303333  0.06537089  0.14958501  0.10881832  0.03411625  0.03055004
  0.17401894  0.15628269  0.36927185  0.14380753  0.21832514  0.1085929
  0.31231828  0.0218874   0.40761169  0.04273973  0.24470926  0.35162779
  0.02502319  0.54511499  0.23535247  0.14200114  0.22580834  0.26066528
  0.15318455  0.07187234  0.1555777   0.03170688  0.09253653  0.17808631
  0.21664318  0.04324673  0.08574725  0.26320555  0.05676086  0.30173266
  0.15000187  0.17741751  0.40625973  0.27638824  0.17456841 -0.11024677
  0.18637651 -0.01075724  0.46251654  0.47278272  0.42777153 -0.08414159
  0.37085375  0.54236823  0.35674241  0.57472257 -0.01523324  0.27343316
  0.34095769  0.14276888  0.41593601  0.21100663 -0.1435183   0.46137601
  0.29664449 -0.13244145  0.78739651  0.14011143  0.34123022  0.52274885
  0.52253002 -0.00831914  0.71942153 -0.09480183  0.19412914  0.4748261
 -0.14253483  0.37854886  0.19939545  0.58659529  0.50883125  0.20998122
  0.12780275  0.64582879  0.294836   -0.1217848   0.24019831  0.3332294
  0.24772708  0.24633703  0.19834295  0.2541286   0.37923563  0.42201021
  0.29120014  0.35271566  0.47502845  0.23565366  0.49592469  0.07611124
  0.17611575  0.39532443  0.04002643  0.3003077   0.15071586  0.45347551
  0.31974513  0.33845825  0.137176   -0.06612285  0.1263379   0.12711527
  0.12876345  0.0600375  -0.00079152 -0.05656678  0.13398567 -0.00712507
  0.3262046   0.41401959  0.3175494   0.23284357  0.47795186  0.20827132
  0.01061553  0.09854553 -0.16686496  0.31569722  0.28411536 -0.01280278
  0.06336972  0.07160814  0.10047051  0.3871082   0.32507695  0.51207936
  0.20267353  0.02332904  0.10722372  0.07442154  0.14077513  0.15308255
  0.28279133  0.31112127  0.05339112  0.21368885  0.29889329  0.38866572
  0.31519928  0.49543463  0.4807261   0.49170269  0.30099423  0.48291982
  0.44732056  0.51477357  0.32845559  0.48284658 -0.077511    0.11733549
  0.12272461  0.04507321  0.32035405  0.43947692  0.44438524  0.13912738
  0.64042949  0.29558916]

approx error on U on val data  [[0.24267877769855098, 0.41533715900924556, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.36807271653053697, 0.3341363616556508, 0.39820873737209206], [0.24267877769855115, 0.2973521909173338, 0.40451618020483854, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.442751308909726], [0.242678777698551, 0.29735219091733733, 0.3520643368808044, 0.27375205578352574, 0.23611652516689285, 0.34017933632769454, 0.33413636165565125, 0.3986182455403873, 0.23360339995603016, 0.28178804220163], [0.24267276644603455, 0.29735219091733545, 0.27377845318716604, 0.23451881036466365, 0.3401793363276946, 0.33549771278318047, 0.39820873737209056, 0.22683046676263613, 0.28178804220161013, 0.4007424437637802]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.4185891   0.28296558  0.24718031  0.20541211 -0.05030321  0.18669821
  0.31132045  0.62308282 -0.2186119   0.04937619  0.44758411  0.55769205
  0.40367235  0.42276123  0.3766442   1.19849394  0.36015382  0.23951143
  0.64668986  0.23190563  0.27176061  0.26707368  0.1022309   0.13188939
  0.32786302  0.15173169  0.31877173  0.32915423  0.09335622  0.14826524
  0.08022364  0.30081999  0.38747879  0.33948488  0.23075891  0.3086462
  0.27884693  0.43041459  0.34418123  0.08420461  0.59154531  0.33897309
  0.31204881  0.52657586  0.21463722  0.36962173  0.26934965  0.47993553
  0.12773773  0.08161826  0.27686579  0.35466079  0.45915224  0.43924284
  0.29804132  0.37811537  0.05210179  0.44775751  0.21242823  0.69355498
  0.0725858   0.13917159  0.30890825  0.11735696  0.0938895  -0.01894121
  0.63674907  0.58129111  0.2933675   0.12908768  0.09327284 -0.0785141
  0.32860049  0.36156194 -0.05903117  0.17043443  0.34057176  0.29853138
  0.41329276  0.62398501  0.13614903  0.29065028  0.19330403  0.15174407
  0.28691874  0.21117014  0.44516099  0.48580328  0.38929935  0.03502283
  0.35917519  0.26321667  0.20732957  0.07700557  0.16811142  0.16170665
  0.46115252  0.21809239  0.61754816  0.58524922]

approx error on V on Val data  [[0.3563420090207584, 0.30552073465556595, 0.33554531900796836, 0.3463328142555741, 0.30197903682795574], [0.2699766326061984, 0.43664479101758474, 0.441828670924199, 0.2941347128912874, 0.3541301283812711], [0.4418286709241979, 0.3832022746629383, 0.30155642826846296, 0.3304427693794233, 0.3120386944680726], [0.44182867092419825, 0.404516180204851, 0.25851996965785634, 0.3027439819624869, 0.39204624029599416], [0.37967892205407766, 0.31203869446807575, 0.3258229735468181, 0.4045161802048402, 0.38575951518888096], [0.3416127722541395, 0.3491886879903772, 0.40451618020483765, 0.2961968473389621, 0.3785311614495693]]

predicting: 100%|██████████| 1/1 [00:34<00:00, 34.56s/it]

Make new V by taking top v highest loss subsets from L \ U
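The step logged above rebuilds V from the v highest-loss subsets in L \ U. A minimal sketch of that selection, assuming per-subset losses are available as a mapping (names hypothetical):

```python
# Hypothetical sketch: rebuild V from the v subsets in L \ U with the
# highest LLM loss on the validation data.
def make_new_V(L, U, losses, v):
    """losses: mapping from subset id -> LLM loss on validation data."""
    candidates = [s for s in L if s not in set(U)]
    candidates.sort(key=lambda s: losses[s], reverse=True)  # highest loss first
    return candidates[:v]

L = list(range(10))
U = [0, 2, 4]
losses = {s: s * 0.05 for s in L}      # toy losses
print(make_new_V(L, U, losses, v=3))   # [9, 8, 7]
```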

predicting: 100%|██████████| 5/5 [02:35<00:00, 31.13s/it]

***********************************
S_worst_ind  9

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

AVG_LLM_loss_on_VAL_data  [0.275, 0.26499999999999996, 0.24499999999999997, 0.24499999999999997, 0.22999999999999998, 0.22999999999999998, 0.21999999999999997]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.4, 0.35, 0.35, 0.35, 0.35, 0.35, 0.3]
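The 200-long 0/1 vectors logged as `LLM_loss_on_val` look like per-example errors for 10 subsets x 20 validation examples, with the AVG/MIN/MAX summaries taken over the 10 per-subset means (e.g. 3 errors out of 20 gives the 0.15 seen above). A hedged reconstruction of that bookkeeping (the 10x20 layout is an inference from the sizes, not confirmed by the source):

```python
import numpy as np

# Toy stand-in for the 200-long binary error vector: subset 0 gets
# 3 errors out of its 20 validation examples -> per-subset loss 0.15.
flags = np.zeros(200, dtype=int)
flags[:3] = 1

per_subset = flags.reshape(10, 20).mean(axis=1)  # one loss per subset
print(per_subset[0])                              # 0.15
print(per_subset.mean(), per_subset.min(), per_subset.max())
```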

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.25, 0.2, 0.25, 0.3, 0.3], [0.2, 0.4, 0.35, 0.2, 0.25], [0.35, 0.3, 0.3, 0.25, 0.2], [0.35, 0.35, 0.2, 0.2, 0.3], [0.3, 0.2, 0.3, 0.35, 0.4], [0.35, 0.25, 0.35, 0.25, 0.3], [0.3, 0.25, 0.35, 0.25, 0.2]]

AVG_LLM_loss_on_VAL_data  [0.26, 0.28, 0.27999999999999997, 0.27999999999999997, 0.30999999999999994, 0.3, 0.26999999999999996]

MIN_LLM_loss_on_VAL_data  [0.2, 0.2, 0.2, 0.2, 0.2, 0.25, 0.2]

MAX_LLM_loss_on_VAL_data  [0.3, 0.4, 0.35, 0.35, 0.4, 0.35, 0.35]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 3.31687748e-01  2.24737417e-01  1.53572098e-01  9.71794262e-02
  2.77712194e-01  2.70653359e-01  6.79413311e-02  4.07286928e-01
 -1.01741308e-01  1.30106632e-01 -7.02362818e-03  9.27628308e-02
  1.28255652e-01  2.05493859e-01  8.86466633e-02  1.30368427e-01
 -9.60100282e-02  2.56011240e-01  1.74178995e-01  3.57542639e-02
  5.66949239e-02  2.38651171e-01  3.09831214e-01  5.79955385e-02
  9.38507949e-02  1.27713784e-01  2.73068286e-01  1.44484730e-01
  1.27645284e-01  2.52519421e-01  2.74590640e-01  1.58461207e-01
  1.65332281e-01  2.11056741e-01  2.97459280e-01  4.21798257e-01
  3.93033332e-01  6.53708857e-02  1.49585014e-01  1.08818317e-01
  3.41162511e-02  3.05500447e-02  1.74018936e-01  1.56282691e-01
  3.69271847e-01  1.43807533e-01  2.18325141e-01  1.08592899e-01
  3.12318285e-01  2.18873972e-02  4.07611686e-01  4.27397252e-02
  2.44709262e-01  3.51627792e-01  2.50231883e-02  5.45114994e-01
  2.35352468e-01  1.42001143e-01  2.25808339e-01  2.60665278e-01
  1.53184554e-01  7.18723370e-02  1.55577700e-01  3.17068792e-02
  9.25365320e-02  1.78086308e-01  2.16643179e-01  4.32467275e-02
  8.57472471e-02  2.63205552e-01  5.67608613e-02  3.01732658e-01
  1.50001866e-01  1.77417513e-01  4.06259731e-01  2.76388240e-01
  1.74568406e-01 -1.10246766e-01  1.86376506e-01 -1.07572407e-02
  4.62516545e-01  4.72782717e-01  4.27771530e-01 -8.41415873e-02
  3.70853746e-01  5.42368229e-01  3.56742406e-01  5.74722565e-01
 -1.52332433e-02  2.73433163e-01  3.40957686e-01  1.42768882e-01
  4.15936007e-01  2.11006632e-01 -1.43518296e-01  4.61376012e-01
  2.96644493e-01 -1.32441451e-01  7.87396505e-01  1.40111432e-01
  3.41230217e-01  5.22748855e-01  5.22530016e-01 -8.31914425e-03
  7.19421527e-01 -9.48018263e-02  1.94129137e-01  4.74826095e-01
 -1.42534826e-01  3.78548860e-01  1.99395453e-01  5.86595289e-01
  5.08831251e-01  2.09981217e-01  1.27802745e-01  6.45828789e-01
  2.94836005e-01 -1.21784795e-01  2.40198309e-01  3.33229396e-01
  2.47727082e-01  2.46337029e-01  1.98342954e-01  2.54128599e-01
  3.79235632e-01  4.22010209e-01  2.91200142e-01  3.52715658e-01
  4.75028449e-01  2.35653660e-01  4.95924690e-01  7.61112392e-02
  1.76115754e-01  3.95324434e-01  4.00264327e-02  3.00307697e-01
  1.50715863e-01  4.53475508e-01  3.19745134e-01  3.38458254e-01
  1.37175996e-01 -6.61228541e-02  1.26337895e-01  1.27115274e-01
  1.28763452e-01  6.00374992e-02 -7.91524960e-04 -5.65667800e-02
  1.33985669e-01 -7.12506903e-03  3.26204596e-01  4.14019588e-01
  3.17549403e-01  2.32843568e-01  4.77951859e-01  2.08271318e-01
  1.06155272e-02  9.85455299e-02 -1.66864965e-01  3.15697221e-01
  2.84115362e-01 -1.28027850e-02  6.33697161e-02  7.16081405e-02
  1.00470514e-01  3.87108201e-01  3.25076953e-01  5.12079363e-01
  2.02673525e-01  2.33290404e-02  1.07223717e-01  7.44215374e-02
  1.40775132e-01  1.53082553e-01  2.82791325e-01  3.11121265e-01
  5.33911167e-02  2.13688854e-01  2.98893292e-01  3.88665724e-01
 -9.16789520e-01 -9.70186572e-01 -9.63232428e-01 -1.00905629e+00
 -9.25747593e-01 -9.23545937e-01 -8.86021418e-01 -8.95500328e-01
 -8.83716868e-01 -8.93264894e-01 -9.26346637e-01 -9.18484183e-01
 -9.61701926e-01 -9.30279985e-01 -1.00965419e+00 -8.24393745e-01
 -9.55167164e-01 -9.01302370e-01 -8.91854021e-01 -8.70995492e-01]

approx error on U for Validation Data after updating U  [[0.24267877769855098, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036, 3.6125006243733644], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.3341363616556508, 0.39820873737209206, 0.5413699612501198], [0.24267877769855115, 0.2973521909173338, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015, 0.3500000000000061], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.20000000000000498], [0.242678777698551, 0.29735219091733733, 0.27375205578352574, 0.23611652516689285, 0.34017933632769454, 0.33413636165565125, 0.3986182455403873, 0.23360339995603016, 0.28178804220163, 0.48727790838879315], [0.24267276644603455, 0.29735219091733545, 0.27377845318716604, 0.23451881036466365, 0.3401793363276946, 0.33549771278318047, 0.39820873737209056, 0.22683046676263613, 0.28178804220161013, 1.1728620783598136]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.13614903  0.29065028  0.19330403  0.15174407  0.28691874  0.21117014
  0.44516099  0.48580328  0.38929935  0.03502283  0.35917519  0.26321667
  0.20732957  0.07700557  0.16811142  0.16170665  0.46115252  0.21809239
  0.61754816  0.58524922  0.31519928  0.49543463  0.4807261   0.49170269
  0.30099423  0.48291982  0.44732056  0.51477357  0.32845559  0.48284658
 -0.077511    0.11733549  0.12272461  0.04507321  0.32035405  0.43947692
  0.44438524  0.13912738  0.64042949  0.29558916  0.59154531  0.33897309
  0.31204881  0.52657586  0.21463722  0.36962173  0.26934965  0.47993553
  0.12773773  0.08161826  0.27686579  0.35466079  0.45915224  0.43924284
  0.29804132  0.37811537  0.05210179  0.44775751  0.21242823  0.69355498
  0.4185891   0.28296558  0.24718031  0.20541211 -0.05030321  0.18669821
  0.31132045  0.62308282 -0.2186119   0.04937619  0.44758411  0.55769205
  0.40367235  0.42276123  0.3766442   1.19849394  0.36015382  0.23951143
  0.64668986  0.23190563  1.0817072   1.10644112  1.05306328  1.14544446
  1.13040319  1.11013466  1.06853138  1.13034591  1.08766799  1.07948072
  1.18454251  1.02458508  1.09120668  1.09012226  1.06913169  0.98416702
  1.19258369  1.06580639  1.11525957  1.0063612 ]

approx error on V for Validation Data after updating V  [[0.3594888749593668, 0.39127929446089127, 0.44827966485148363, 5.6391632035887636, 1.2353049767603683], [0.441828670924199, 0.38406726017293075, 0.40305602082422726, 0.5268244446936265, 0.8924325973686938], [0.4418286709241979, 0.40451618020483854, 5.2390127728398594, 1.2853049767603693, 0.5816677302860594], [0.4235089257700596, 0.8924325973687317, 0.5079158217231468, 0.404516180204851, 3.995347576959955], [0.3537308765669242, 0.51241785331602, 0.4045161802048402, 0.365809380659885, 4.459467692177759], [0.38031777241113396, 0.40176326449938954, 0.40451618020483765, 0.30081871705957725, 0.8924325973687448]]

overlaps  [[0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 1, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444, 0.06666666666666668, 0.04444444444444444]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.2222222222222222, 0.1111111111111111]

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]
Intel MKL WARNING: Support of Intel(R) Advanced Vector Extensions (Intel(R) AVX) enabled only processors has been deprecated. Intel oneAPI Math Kernel Library will use Intel(R) Streaming SIMD Extensions 4.2 (Intel(R) SSE4.2) instructions instead.

alpha shape  (1780,)

alpha  [ 2.27595720e-15 -1.19904087e-14 -6.21724894e-15 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 3.31599842e-01  2.24824675e-01  1.53567572e-01  9.71008837e-02
  2.77708907e-01  2.70633177e-01  6.79373219e-02  4.07183322e-01
 -1.01783313e-01  1.29908286e-01 -6.96302458e-03  9.26580216e-02
  1.28396063e-01  2.05558485e-01  8.86578525e-02  1.30411675e-01
 -9.59528340e-02  2.56030883e-01  1.74375221e-01  3.57389374e-02
  5.66949239e-02  2.38651171e-01  3.09831214e-01  5.79955385e-02
  9.38507949e-02  1.27713784e-01  2.73068286e-01  1.44484730e-01
  1.27645284e-01  2.52519421e-01  2.74590640e-01  1.58461207e-01
  1.65332281e-01  2.11056741e-01  2.97459280e-01  4.21798257e-01
  3.93033332e-01  6.53708857e-02  1.49585014e-01  1.08818317e-01
  3.40862667e-02  3.05175099e-02  1.73898428e-01  1.56279770e-01
  3.69380460e-01  1.43785032e-01  2.18279954e-01  1.08592026e-01
  3.12363190e-01  2.19045003e-02  4.07790559e-01  4.27549139e-02
  2.44483363e-01  3.51555418e-01  2.50654377e-02  5.45097641e-01
  2.35415554e-01  1.41903485e-01  2.25915592e-01  2.60763649e-01
  1.03222460e-01  9.68856486e-02  1.99629241e-01  2.33586210e-02
  1.57428622e-01  1.70001102e-01  2.18416883e-01  2.13910026e-02
  9.94602603e-02  2.26526084e-01  4.93493513e-02  3.01298593e-01
  1.66424462e-01  1.19664897e-01  3.75240811e-01  2.81583886e-01
  1.41587941e-01 -6.78623498e-02  1.74609654e-01  4.58943750e-02
  4.62516545e-01  4.72782717e-01  4.27771530e-01 -8.41415873e-02
  3.70853746e-01  5.42368229e-01  3.56742406e-01  5.74722565e-01
 -1.52332433e-02  2.73433163e-01  3.40957686e-01  1.42768882e-01
  4.15936007e-01  2.11006632e-01 -1.43518296e-01  4.61376012e-01
  2.96644493e-01 -1.32441451e-01  7.87396505e-01  1.40111432e-01
  3.35431075e-01  5.46763410e-01  5.29496740e-01  2.35901169e-02
  7.15593564e-01 -1.12298096e-01  1.76047254e-01  5.06818096e-01
 -1.29899579e-01  3.82184846e-01  2.04919644e-01  5.33797976e-01
  4.82195848e-01  2.06584159e-01  8.88180868e-02  6.31979952e-01
  2.71928719e-01 -9.88111826e-02  2.78040085e-01  3.52186196e-01
  2.46391606e-01  2.53623442e-01  2.03263237e-01  2.59052182e-01
  3.71144666e-01  4.19804595e-01  2.93491792e-01  3.39496272e-01
  4.73071453e-01  2.41853484e-01  4.86097432e-01  7.95160198e-02
  1.74893105e-01  3.96915497e-01  3.90935152e-02  3.01689871e-01
  1.49597390e-01  4.59704998e-01  3.23439914e-01  3.32985486e-01
  1.34858979e-01 -6.62265000e-02  1.27088566e-01  1.27567169e-01
  1.28650945e-01  6.18758289e-02 -1.69186676e-04 -5.92787143e-02
  1.36059777e-01 -7.55422473e-03  3.26297401e-01  4.14001772e-01
  3.18375404e-01  2.32112328e-01  4.76829756e-01  2.09888010e-01
  1.14030226e-02  9.77570783e-02 -1.65522083e-01  3.13401835e-01
  2.84115362e-01 -1.28027850e-02  6.33697161e-02  7.16081405e-02
  1.00470514e-01  3.87108201e-01  3.25076953e-01  5.12079363e-01
  2.02673525e-01  2.33290404e-02  1.07223717e-01  7.44215374e-02
  1.40775132e-01  1.53082553e-01  2.82791325e-01  3.11121265e-01
  5.33911167e-02  2.13688854e-01  2.98893292e-01  3.88665724e-01
  4.20364520e-01  2.52716099e-01  2.09193586e-01  2.55326575e-01
  4.50757069e-01  2.10101835e-01  2.48964042e-01  5.81311670e-01
  4.24988400e-02  4.65461463e-01 -2.15120738e-03  2.46527754e-01
 -3.19588921e-02 -7.40952851e-02  2.25594223e-01  5.02225669e-01
  2.01519536e-01  1.48937170e-01  1.38613618e-01  3.82927292e-01]

approx error on U on val data  [[0.24267877769855098, 0.41533715900924556, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.36807271653053697, 0.3341363616556508, 0.39820873737209206], [0.24267877769855115, 0.2973521909173338, 0.40451618020483854, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.442751308909726], [0.242678777698551, 0.29735219091733733, 0.3520643368808044, 0.27375205578352574, 0.23611652516689285, 0.34017933632769454, 0.33413636165565125, 0.3986182455403873, 0.23360339995603016, 0.28178804220163], [0.24267276644603455, 0.29735219091733545, 0.27377845318716604, 0.23451881036466365, 0.3401793363276946, 0.33549771278318047, 0.39820873737209056, 0.22683046676263613, 0.28178804220161013, 0.4007424437637802], [0.24265249310254342, 0.2973521909173312, 0.27375205578351836, 0.23371984987019778, 0.340179336327695, 0.33413636165565175, 0.3986182455403887, 0.2270167672521204, 0.2817880422016302, 0.3231351337337901]]
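A hedged guess at what the "approx error" entries represent: a mean absolute gap between the real-valued approximation and the 0/1 LLM losses, computed once per subset. The toy arrays below are stand-ins, not values from this run.

```python
import numpy as np

llm_loss = np.array([0, 1, 0, 0, 1])                 # observed 0/1 losses (toy)
approx = np.array([0.33, 0.22, 0.15, 0.40, 0.28])    # fitted approximation (toy)

# Mean absolute deviation between approximation and true losses.
approx_error = float(np.mean(np.abs(approx - llm_loss)))
print(approx_error)
```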

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 3.20505762e-02  2.51063717e-01  4.53259272e-01  1.48365136e-01
  4.97506543e-01  6.74404077e-03  4.02913771e-01  2.06540453e-01
  4.34037912e-01  7.20241052e-02  4.33292798e-01  2.24289075e-01
  4.01873570e-01 -9.08877597e-03  3.75188627e-01  3.34060201e-01
  4.97933948e-01 -2.12513610e-02  6.53870370e-01  4.14758891e-01
  2.03286446e-01  3.15443993e-01  3.26104104e-01  3.52564284e-01
  2.25093638e-01  3.04042412e-01  3.12662243e-01  3.42395090e-01
  2.51223414e-01  3.00132389e-01 -4.32399246e-03  6.66968075e-02
  1.26767705e-01  4.49594196e-02  1.94982797e-01  2.91848699e-01
  2.91421848e-01  1.43311425e-01  4.26509992e-01  2.47673283e-01
  5.91545311e-01  3.38973090e-01  3.12048814e-01  5.26575858e-01
  2.14637225e-01  3.69621732e-01  2.69349650e-01  4.79935533e-01
  1.27737734e-01  8.16182621e-02  2.76865791e-01  3.54660792e-01
  4.59152239e-01  4.39242840e-01  2.98041324e-01  3.78115374e-01
  5.21017924e-02  4.47757515e-01  2.12428232e-01  6.93554982e-01
  3.02639029e-01  2.76507988e-01  3.67102340e-01 -9.76813950e-04
 -1.54110906e-01  7.07983970e-02  1.26295428e-01  6.68595078e-01
 -3.01908716e-01 -1.18878414e-01  3.44460642e-01  4.26866264e-01
  3.64029291e-01  3.07939729e-01  1.03187510e-01  1.14759658e+00
  1.72061983e-01  2.45707399e-01  5.83117190e-01  6.73646230e-02
  2.48704543e-01  2.32346770e-01  2.11675038e-01  2.44738948e-01
  1.77251811e-01  1.62165167e-01  2.39863901e-01  4.86772445e-01
  1.93643162e-01  8.58617643e-02  3.26703079e-01 -2.10046270e-01
  2.09301202e-01  2.58832369e-01 -1.26976611e-01  1.88532371e-01
  1.93193977e-01  2.53529578e-01  2.54470417e-01  2.04285068e-01]

approx error on V on Val data  [[0.3563420090207584, 0.30552073465556595, 0.33554531900796836, 0.3463328142555741, 0.30197903682795574], [0.2699766326061984, 0.43664479101758474, 0.441828670924199, 0.2941347128912874, 0.3541301283812711], [0.4418286709241979, 0.3832022746629383, 0.30155642826846296, 0.3304427693794233, 0.3120386944680726], [0.44182867092419825, 0.404516180204851, 0.25851996965785634, 0.3027439819624869, 0.39204624029599416], [0.37967892205407766, 0.31203869446807575, 0.3258229735468181, 0.4045161802048402, 0.38575951518888096], [0.3416127722541395, 0.3491886879903772, 0.40451618020483765, 0.2961968473389621, 0.3785311614495693], [0.3497449903081983, 0.3526889104895011, 0.4045161802048372, 0.25324643402482755, 0.3120386944680744]]

predicting: 100%|██████████| 1/1 [00:33<00:00, 33.39s/it]

Make new V by taking top v highest loss subsets from L \ U
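The "Make new V" step above can be sketched as ranking candidate subsets not already in U by their validation loss and keeping the v worst. The subset indices, loss values, and names below are illustrative only.

```python
v = 5
# Toy per-subset validation losses for all subsets in the pool L.
subset_losses = {0: 0.20, 1: 0.45, 2: 0.30, 3: 0.25, 4: 0.40,
                 5: 0.35, 6: 0.15, 7: 0.50}
U = {0, 6}  # subsets already selected into U

# Restrict to L \ U, then take the v highest-loss subsets.
candidates = {s: l for s, l in subset_losses.items() if s not in U}
V = sorted(candidates, key=candidates.get, reverse=True)[:v]
print(V)  # indices of the v highest-loss subsets outside U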

predicting: 100%|██████████| 5/5 [02:45<00:00, 33.05s/it]

***********************************
S_worst_ind  5

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

AVG_LLM_loss_on_VAL_data  [0.275, 0.26499999999999996, 0.24499999999999997, 0.24499999999999997, 0.22999999999999998, 0.22999999999999998, 0.21999999999999997, 0.21000000000000002]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.4, 0.35, 0.35, 0.35, 0.35, 0.35, 0.3, 0.3]
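The AVG/MIN/MAX lines appear to be running histories, one entry appended per search iteration, summarising the per-subset validation losses. A minimal sketch, using the first iteration's losses from this log:

```python
# Per-subset LLM losses on validation data at iteration 0 (from the log).
losses = [0.15, 0.4, 0.2, 0.35, 0.35, 0.2, 0.15, 0.3, 0.35, 0.3]

avg_hist, min_hist, max_hist = [], [], []
# One append per iteration of the subset search.
avg_hist.append(sum(losses) / len(losses))
min_hist.append(min(losses))
max_hist.append(max(losses))
print(avg_hist, min_hist, max_hist)
```

This reproduces the first entries printed above: average 0.275, minimum 0.15, maximum 0.4.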

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.25, 0.2, 0.25, 0.3, 0.3], [0.2, 0.4, 0.35, 0.2, 0.25], [0.35, 0.3, 0.3, 0.25, 0.2], [0.35, 0.35, 0.2, 0.2, 0.3], [0.3, 0.2, 0.3, 0.35, 0.4], [0.35, 0.25, 0.35, 0.25, 0.3], [0.3, 0.25, 0.35, 0.25, 0.2], [0.25, 0.2, 0.45, 0.35, 0.25]]

AVG_LLM_loss_on_VAL_data  [0.26, 0.28, 0.27999999999999997, 0.27999999999999997, 0.30999999999999994, 0.3, 0.26999999999999996, 0.3]

MIN_LLM_loss_on_VAL_data  [0.2, 0.2, 0.2, 0.2, 0.2, 0.25, 0.2, 0.2]

MAX_LLM_loss_on_VAL_data  [0.3, 0.4, 0.35, 0.35, 0.4, 0.35, 0.35, 0.45]

*************Approximation error of Validation Data on U after updating U ************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 3.31599842e-01  2.24824675e-01  1.53567572e-01  9.71008837e-02
  2.77708907e-01  2.70633177e-01  6.79373219e-02  4.07183322e-01
 -1.01783313e-01  1.29908286e-01 -6.96302458e-03  9.26580216e-02
  1.28396063e-01  2.05558485e-01  8.86578525e-02  1.30411675e-01
 -9.59528340e-02  2.56030883e-01  1.74375221e-01  3.57389374e-02
  5.66949239e-02  2.38651171e-01  3.09831214e-01  5.79955385e-02
  9.38507949e-02  1.27713784e-01  2.73068286e-01  1.44484730e-01
  1.27645284e-01  2.52519421e-01  2.74590640e-01  1.58461207e-01
  1.65332281e-01  2.11056741e-01  2.97459280e-01  4.21798257e-01
  3.93033332e-01  6.53708857e-02  1.49585014e-01  1.08818317e-01
  3.40862667e-02  3.05175099e-02  1.73898428e-01  1.56279770e-01
  3.69380460e-01  1.43785032e-01  2.18279954e-01  1.08592026e-01
  3.12363190e-01  2.19045003e-02  4.07790559e-01  4.27549139e-02
  2.44483363e-01  3.51555418e-01  2.50654377e-02  5.45097641e-01
  2.35415554e-01  1.41903485e-01  2.25915592e-01  2.60763649e-01
  1.03222460e-01  9.68856486e-02  1.99629241e-01  2.33586210e-02
  1.57428622e-01  1.70001102e-01  2.18416883e-01  2.13910026e-02
  9.94602603e-02  2.26526084e-01  4.93493513e-02  3.01298593e-01
  1.66424462e-01  1.19664897e-01  3.75240811e-01  2.81583886e-01
  1.41587941e-01 -6.78623498e-02  1.74609654e-01  4.58943750e-02
  4.62516545e-01  4.72782717e-01  4.27771530e-01 -8.41415873e-02
  3.70853746e-01  5.42368229e-01  3.56742406e-01  5.74722565e-01
 -1.52332433e-02  2.73433163e-01  3.40957686e-01  1.42768882e-01
  4.15936007e-01  2.11006632e-01 -1.43518296e-01  4.61376012e-01
  2.96644493e-01 -1.32441451e-01  7.87396505e-01  1.40111432e-01
  2.46391606e-01  2.53623442e-01  2.03263237e-01  2.59052182e-01
  3.71144666e-01  4.19804595e-01  2.93491792e-01  3.39496272e-01
  4.73071453e-01  2.41853484e-01  4.86097432e-01  7.95160198e-02
  1.74893105e-01  3.96915497e-01  3.90935152e-02  3.01689871e-01
  1.49597390e-01  4.59704998e-01  3.23439914e-01  3.32985486e-01
  1.34858979e-01 -6.62265000e-02  1.27088566e-01  1.27567169e-01
  1.28650945e-01  6.18758289e-02 -1.69186676e-04 -5.92787143e-02
  1.36059777e-01 -7.55422473e-03  3.26297401e-01  4.14001772e-01
  3.18375404e-01  2.32112328e-01  4.76829756e-01  2.09888010e-01
  1.14030226e-02  9.77570783e-02 -1.65522083e-01  3.13401835e-01
  2.84115362e-01 -1.28027850e-02  6.33697161e-02  7.16081405e-02
  1.00470514e-01  3.87108201e-01  3.25076953e-01  5.12079363e-01
  2.02673525e-01  2.33290404e-02  1.07223717e-01  7.44215374e-02
  1.40775132e-01  1.53082553e-01  2.82791325e-01  3.11121265e-01
  5.33911167e-02  2.13688854e-01  2.98893292e-01  3.88665724e-01
  4.20364520e-01  2.52716099e-01  2.09193586e-01  2.55326575e-01
  4.50757069e-01  2.10101835e-01  2.48964042e-01  5.81311670e-01
  4.24988400e-02  4.65461463e-01 -2.15120738e-03  2.46527754e-01
 -3.19588921e-02 -7.40952851e-02  2.25594223e-01  5.02225669e-01
  2.01519536e-01  1.48937170e-01  1.38613618e-01  3.82927292e-01
 -1.36024118e+00 -1.46823139e+00 -1.44473929e+00 -1.60149098e+00
 -1.38700933e+00 -1.42117664e+00 -1.32856689e+00 -1.37729391e+00
 -1.36351565e+00 -1.31340898e+00 -1.40022394e+00 -1.34195058e+00
 -1.45961565e+00 -1.39905800e+00 -1.53630597e+00 -1.17875117e+00
 -1.38236036e+00 -1.41026874e+00 -1.36936087e+00 -1.39684635e+00]

approx error on U for Validation Data after updating U  [[0.24267877769855098, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036, 3.6125006243733644], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.3341363616556508, 0.39820873737209206, 0.5413699612501198], [0.24267877769855115, 0.2973521909173338, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015, 0.3500000000000061], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.20000000000000498], [0.242678777698551, 0.29735219091733733, 0.27375205578352574, 0.23611652516689285, 0.34017933632769454, 0.33413636165565125, 0.3986182455403873, 0.23360339995603016, 0.28178804220163, 0.48727790838879315], [0.24267276644603455, 0.29735219091733545, 0.27377845318716604, 0.23451881036466365, 0.3401793363276946, 0.33549771278318047, 0.39820873737209056, 0.22683046676263613, 0.28178804220161013, 1.1728620783598136], [0.24265249310254342, 0.2973521909173312, 0.27375205578351836, 0.23371984987019778, 0.340179336327695, 0.3986182455403887, 0.2270167672521204, 0.2817880422016302, 0.3231351337337901, 1.5970207932723892]]

*************Approximation error of Validation Data on V after updating V ************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 1, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 3.02639029e-01  2.76507988e-01  3.67102340e-01 -9.76813950e-04
 -1.54110906e-01  7.07983970e-02  1.26295428e-01  6.68595078e-01
 -3.01908716e-01 -1.18878414e-01  3.44460642e-01  4.26866264e-01
  3.64029291e-01  3.07939729e-01  1.03187510e-01  1.14759658e+00
  1.72061983e-01  2.45707399e-01  5.83117190e-01  6.73646230e-02
  3.20505762e-02  2.51063717e-01  4.53259272e-01  1.48365136e-01
  4.97506543e-01  6.74404077e-03  4.02913771e-01  2.06540453e-01
  4.34037912e-01  7.20241052e-02  4.33292798e-01  2.24289075e-01
  4.01873570e-01 -9.08877597e-03  3.75188627e-01  3.34060201e-01
  4.97933948e-01 -2.12513610e-02  6.53870370e-01  4.14758891e-01
  4.09543316e+00  4.14483176e+00  4.32200408e+00  3.93762173e+00
  4.08400115e+00  3.83763781e+00  4.33743729e+00  4.15300475e+00
  3.76432365e+00  3.99080872e+00  3.80139183e+00  3.98407658e+00
  4.14435100e+00  4.07480684e+00  4.07632958e+00  3.64104680e+00
  4.29711731e+00  3.81411098e+00  4.07247044e+00  3.68881008e+00
  5.91545311e-01  3.38973090e-01  3.12048814e-01  5.26575858e-01
  2.14637225e-01  3.69621732e-01  2.69349650e-01  4.79935533e-01
  1.27737734e-01  8.16182621e-02  2.76865791e-01  3.54660792e-01
  4.59152239e-01  4.39242840e-01  2.98041324e-01  3.78115374e-01
  5.21017924e-02  4.47757515e-01  2.12428232e-01  6.93554982e-01
  3.35431075e-01  5.46763410e-01  5.29496740e-01  2.35901169e-02
  7.15593564e-01 -1.12298096e-01  1.76047254e-01  5.06818096e-01
 -1.29899579e-01  3.82184846e-01  2.04919644e-01  5.33797976e-01
  4.82195848e-01  2.06584159e-01  8.88180868e-02  6.31979952e-01
  2.71928719e-01 -9.88111826e-02  2.78040085e-01  3.52186196e-01]

approx error on V for Validation Data after updating V  [[0.3594888749593668, 0.39127929446089127, 0.44827966485148363, 5.6391632035887636, 1.2353049767603683], [0.441828670924199, 0.38406726017293075, 0.40305602082422726, 0.5268244446936265, 0.8924325973686938], [0.4418286709241979, 0.40451618020483854, 5.2390127728398594, 1.2853049767603693, 0.5816677302860594], [0.4235089257700596, 0.8924325973687317, 0.5079158217231468, 0.404516180204851, 3.995347576959955], [0.3537308765669242, 0.51241785331602, 0.4045161802048402, 0.365809380659885, 4.459467692177759], [0.38031777241113396, 0.40176326449938954, 0.40451618020483765, 0.30081871705957725, 0.8924325973687448], [0.32010594178007995, 0.30990743115021624, 3.5630807786546397, 0.4045161802048372, 0.3375161592095289]]

overlaps  [[0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 1], [0, 0, 0, 0, 0, 0, 0, 0, 1]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444, 0.06666666666666668, 0.04444444444444444, 0.06666666666666668]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.2222222222222222, 0.1111111111111111, 0.1111111111111111]
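The overlap values above are all multiples of 1/9, which suggests subsets of k = 9 exemplars and an overlap defined as the fraction of exemplars two subsets share. The subset contents below are made up for illustration.

```python
k = 9  # assumed subset size, inferred from the 1/9 granularity above
subset_a = {1, 2, 3, 4, 5, 6, 7, 8, 9}
subset_b = {9, 10, 11, 12, 13, 14, 15, 16, 17}

# Shared exemplars divided by subset size.
overlap = len(subset_a & subset_b) / k
print(overlap)  # 1/9 = 0.111...
```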

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 1, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]
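`W_V_val` is a (300, 1780) matrix that prints as mostly zeros, consistent with a sparse indicator/weight matrix marking which train exemplars each row uses. The construction below is a guess at its shape and semantics, with hypothetical column indices.

```python
import numpy as np

n_rows, n_train = 300, 1780
W_V_val = np.zeros((n_rows, n_train))
# Mark a handful of exemplar columns for one row (assumed semantics,
# indices chosen arbitrarily for illustration).
W_V_val[0, [3, 17, 42]] = 1.0
print(" W_V_val_shape ", W_V_val.shape)
```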
Intel MKL WARNING: Support of Intel(R) Advanced Vector Extensions (Intel(R) AVX) enabled only processors has been deprecated. Intel oneAPI Math Kernel Library will use Intel(R) Streaming SIMD Extensions 4.2 (Intel(R) SSE4.2) instructions instead.

alpha shape  (1780,)

alpha  [ 3.30291350e-15 -7.99360578e-15  7.99360578e-15 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.33171381  0.22471154  0.15357344  0.09720271  0.27771317  0.27065934
  0.06794252  0.40731765 -0.10172885  0.13016544 -0.0070416   0.09279391
  0.12821402  0.2054747   0.08864335  0.1303556  -0.09602699  0.25600542
  0.17412081  0.03575881  0.05669492  0.23865117  0.30983121  0.05799554
  0.09385079  0.12771378  0.27306829  0.14448473  0.12764528  0.25251942
  0.27459064  0.15846121  0.16533228  0.21105674  0.29745928  0.42179826
  0.39303333  0.06537089  0.14958501  0.10881832  0.03411625  0.03055004
  0.17401894  0.15628269  0.36927185  0.14380753  0.21832514  0.1085929
  0.31231828  0.0218874   0.40761169  0.04273973  0.24470926  0.35162779
  0.02502319  0.54511499  0.23535247  0.14200114  0.22580834  0.26066528
  0.16799882  0.06445564  0.14251597  0.03418222  0.07329537  0.18048365
  0.21611726  0.04972717  0.0816812   0.27408139  0.05895845  0.30186136
  0.1451324   0.19454175  0.41545716  0.27484768  0.18434745 -0.12281418
  0.1898655  -0.02755502  0.46251654  0.47278272  0.42777153 -0.08414159
  0.37085375  0.54236823  0.35674241  0.57472257 -0.01523324  0.27343316
  0.34095769  0.14276888  0.41593601  0.21100663 -0.1435183   0.46137601
  0.29664449 -0.13244145  0.78739651  0.14011143  0.24772708  0.24633703
  0.19834295  0.2541286   0.37923563  0.42201021  0.29120014  0.35271566
  0.47502845  0.23565366  0.49592469  0.07611124  0.17611575  0.39532443
  0.04002643  0.3003077   0.15071586  0.45347551  0.31974513  0.33845825
  0.07314715 -0.06898702  0.14708206  0.139603    0.12565441  0.11083821
  0.01640628 -0.13150883  0.19130192 -0.01898443  0.32876919  0.41352726
  0.34037524  0.21263635  0.44694348  0.25294727  0.03237731  0.07675733
 -0.12975554  0.25226614  0.28411536 -0.01280278  0.06336972  0.07160814
  0.10047051  0.3871082   0.32507695  0.51207936  0.20267353  0.02332904
  0.10722372  0.07442154  0.14077513  0.15308255  0.28279133  0.31112127
  0.05339112  0.21368885  0.29889329  0.38866572  0.37371403  0.23517157
  0.21496107  0.34281904  0.40420139  0.15702308  0.14109175  0.46959395
  0.10566513  0.3974615   0.10064347  0.2375689   0.05601056 -0.04978898
  0.31703712  0.45812671  0.22348994  0.19995525  0.11271702  0.41925977
  0.117646    0.19989191 -0.0924391   0.14458927  0.35661069  0.14846
  0.215575    0.46188589  0.4659551  -0.07150689  0.0691086   0.10103182
  0.22061792  0.1890939  -0.12683404  0.10112479  0.1883963   0.440082
  0.36648273  0.2682009 ]

approx error on U on val data  [[0.24267877769855098, 0.41533715900924556, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.36807271653053697, 0.3341363616556508, 0.39820873737209206], [0.24267877769855115, 0.2973521909173338, 0.40451618020483854, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.442751308909726], [0.242678777698551, 0.29735219091733733, 0.3520643368808044, 0.27375205578352574, 0.23611652516689285, 0.34017933632769454, 0.33413636165565125, 0.3986182455403873, 0.23360339995603016, 0.28178804220163], [0.24267276644603455, 0.29735219091733545, 0.27377845318716604, 0.23451881036466365, 0.3401793363276946, 0.33549771278318047, 0.39820873737209056, 0.22683046676263613, 0.28178804220161013, 0.4007424437637802], [0.24265249310254342, 0.2973521909173312, 0.27375205578351836, 0.23371984987019778, 0.340179336327695, 0.33413636165565175, 0.3986182455403887, 0.2270167672521204, 0.2817880422016302, 0.3231351337337901], [0.24267877769855112, 0.29735219091734943, 0.27377845318716487, 0.23611652516689188, 0.3401793363276949, 0.39820873737209095, 0.2336193534666589, 0.2817880422016303, 0.3333491588620607, 0.29750721221400134]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 1, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.40470795  0.24194516  0.2833201  -0.02523229 -0.06216702  0.01436496
  0.25270164  0.43153122 -0.36391652  0.46988465  0.2244305   0.36393863
  0.42936427  0.26251771  0.39178185  0.8191875   0.36480573 -0.01604468
  0.35870929  0.08838775 -0.086261    0.17786219  0.15100331  0.02468641
  0.34192119 -0.05398914  0.32428952  0.31320936  0.34086318 -0.10773366
  0.36557944  0.15225812  0.15523217 -0.19461348  0.16874696  0.11437246
  0.43080559 -0.02707217  0.65204068  0.5318488   0.29304959  0.2906977
  0.75646264  0.56554081  0.3262828   0.011171    0.72391958  0.26700694
  0.36323232  0.36690323  0.15167714  0.46271408  0.74554081  0.43696083
  0.45910018  0.50639325  0.56761295  0.48595598  0.39012026  0.75857119
  0.59154531  0.33897309  0.31204881  0.52657586  0.21463722  0.36962173
  0.26934965  0.47993553  0.12773773  0.08161826  0.27686579  0.35466079
  0.45915224  0.43924284  0.29804132  0.37811537  0.05210179  0.44775751
  0.21242823  0.69355498  0.21976891  0.53710057  0.51321886 -0.00913708
  0.55857423 -0.14110291  0.12981352  0.42882152 -0.03293242  0.37849556
  0.24753118  0.27741504  0.27437396  0.21631365  0.00985075  0.59961585
  0.23032731 -0.11676994  0.374368    0.2565158 ]

approx error on V on Val data  [[0.3563420090207584, 0.30552073465556595, 0.33554531900796836, 0.3463328142555741, 0.30197903682795574], [0.2699766326061984, 0.43664479101758474, 0.441828670924199, 0.2941347128912874, 0.3541301283812711], [0.4418286709241979, 0.3832022746629383, 0.30155642826846296, 0.3304427693794233, 0.3120386944680726], [0.44182867092419825, 0.404516180204851, 0.25851996965785634, 0.3027439819624869, 0.39204624029599416], [0.37967892205407766, 0.31203869446807575, 0.3258229735468181, 0.4045161802048402, 0.38575951518888096], [0.3416127722541395, 0.3491886879903772, 0.40451618020483765, 0.2961968473389621, 0.3785311614495693], [0.3497449903081983, 0.3526889104895011, 0.4045161802048372, 0.25324643402482755, 0.3120386944680744], [0.2993385374723855, 0.2693353030842348, 0.4185823861646439, 0.4045161802048304, 0.3085625353480101]]

predicting: 100%|██████████| 1/1 [00:34<00:00, 34.31s/it]

Make new V by taking top v highest loss subsets from L \ U

predicting: 100%|██████████| 5/5 [02:45<00:00, 33.00s/it]

***********************************
S_worst_ind  4

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1]

AVG_LLM_loss_on_VAL_data  [0.275, 0.26499999999999996, 0.24499999999999997, 0.24499999999999997, 0.22999999999999998, 0.22999999999999998, 0.21999999999999997, 0.21000000000000002, 0.215]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.4, 0.35, 0.35, 0.35, 0.35, 0.35, 0.3, 0.3, 0.35]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.25, 0.2, 0.25, 0.3, 0.3], [0.2, 0.4, 0.35, 0.2, 0.25], [0.35, 0.3, 0.3, 0.25, 0.2], [0.35, 0.35, 0.2, 0.2, 0.3], [0.3, 0.2, 0.3, 0.35, 0.4], [0.35, 0.25, 0.35, 0.25, 0.3], [0.3, 0.25, 0.35, 0.25, 0.2], [0.25, 0.2, 0.45, 0.35, 0.25], [0.3, 0.35, 0.25, 0.4, 0.2]]

AVG_LLM_loss_on_VAL_data  [0.26, 0.28, 0.27999999999999997, 0.27999999999999997, 0.30999999999999994, 0.3, 0.26999999999999996, 0.3, 0.29999999999999993]

MIN_LLM_loss_on_VAL_data  [0.2, 0.2, 0.2, 0.2, 0.2, 0.25, 0.2, 0.2, 0.2]

MAX_LLM_loss_on_VAL_data  [0.3, 0.4, 0.35, 0.35, 0.4, 0.35, 0.35, 0.45, 0.4]
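The AVG/MIN/MAX_LLM_loss_on_VAL_data lists above grow by one entry per outer round. A hedged sketch of how each entry could be produced, appending the mean, min, and max of that round's per-subset validation losses (variable names are assumptions):

```python
# Summarize one round's per-subset 0-1 validation losses into the three
# statistics the log prints. A minimal sketch; names are made up here.
def summarize_round(per_subset_losses):
    avg = sum(per_subset_losses) / len(per_subset_losses)
    return avg, min(per_subset_losses), max(per_subset_losses)

# e.g. the first row of LLM_loss_on_val for V in this section
avg, lo, hi = summarize_round([0.25, 0.2, 0.25, 0.3, 0.3])
```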

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 3.31713813e-01  2.24711544e-01  1.53573440e-01  9.72027148e-02
  2.77713168e-01  2.70659343e-01  6.79425198e-02  4.07317648e-01
 -1.01728853e-01  1.30165444e-01 -7.04159776e-03  9.27939078e-02
  1.28214019e-01  2.05474696e-01  8.86433456e-02  1.30355604e-01
 -9.60269869e-02  2.56005415e-01  1.74120812e-01  3.57588083e-02
  5.66949239e-02  2.38651171e-01  3.09831214e-01  5.79955385e-02
  9.38507949e-02  1.27713784e-01  2.73068286e-01  1.44484730e-01
  1.27645284e-01  2.52519421e-01  2.74590640e-01  1.58461207e-01
  1.65332281e-01  2.11056741e-01  2.97459280e-01  4.21798257e-01
  3.93033332e-01  6.53708857e-02  1.49585014e-01  1.08818317e-01
  3.41162511e-02  3.05500447e-02  1.74018936e-01  1.56282691e-01
  3.69271847e-01  1.43807533e-01  2.18325141e-01  1.08592899e-01
  3.12318285e-01  2.18873972e-02  4.07611686e-01  4.27397252e-02
  2.44709262e-01  3.51627792e-01  2.50231883e-02  5.45114994e-01
  2.35352468e-01  1.42001143e-01  2.25808339e-01  2.60665278e-01
  1.67998823e-01  6.44556358e-02  1.42515970e-01  3.41822227e-02
  7.32953674e-02  1.80483654e-01  2.16117257e-01  4.97271723e-02
  8.16811992e-02  2.74081387e-01  5.89584494e-02  3.01861363e-01
  1.45132399e-01  1.94541750e-01  4.15457156e-01  2.74847678e-01
  1.84347449e-01 -1.22814177e-01  1.89865497e-01 -2.75550208e-02
  2.47727082e-01  2.46337029e-01  1.98342954e-01  2.54128599e-01
  3.79235632e-01  4.22010209e-01  2.91200142e-01  3.52715658e-01
  4.75028449e-01  2.35653660e-01  4.95924690e-01  7.61112392e-02
  1.76115754e-01  3.95324434e-01  4.00264327e-02  3.00307697e-01
  1.50715863e-01  4.53475508e-01  3.19745134e-01  3.38458254e-01
  7.31471542e-02 -6.89870211e-02  1.47082057e-01  1.39603005e-01
  1.25654413e-01  1.10838211e-01  1.64062760e-02 -1.31508828e-01
  1.91301916e-01 -1.89844297e-02  3.28769190e-01  4.13527256e-01
  3.40375240e-01  2.12636348e-01  4.46943475e-01  2.52947270e-01
  3.23773061e-02  7.67573263e-02 -1.29755541e-01  2.52266143e-01
  2.84115362e-01 -1.28027850e-02  6.33697161e-02  7.16081405e-02
  1.00470514e-01  3.87108201e-01  3.25076953e-01  5.12079363e-01
  2.02673525e-01  2.33290404e-02  1.07223717e-01  7.44215374e-02
  1.40775132e-01  1.53082553e-01  2.82791325e-01  3.11121265e-01
  5.33911167e-02  2.13688854e-01  2.98893292e-01  3.88665724e-01
  3.73714034e-01  2.35171572e-01  2.14961071e-01  3.42819039e-01
  4.04201388e-01  1.57023082e-01  1.41091746e-01  4.69593947e-01
  1.05665135e-01  3.97461501e-01  1.00643467e-01  2.37568896e-01
  5.60105602e-02 -4.97889774e-02  3.17037118e-01  4.58126712e-01
  2.23489936e-01  1.99955250e-01  1.12717023e-01  4.19259770e-01
  1.17646003e-01  1.99891913e-01 -9.24390993e-02  1.44589267e-01
  3.56610690e-01  1.48460000e-01  2.15575000e-01  4.61885885e-01
  4.65955099e-01 -7.15068874e-02  6.91085999e-02  1.01031822e-01
  2.20617916e-01  1.89093904e-01 -1.26834044e-01  1.01124791e-01
  1.88396297e-01  4.40082005e-01  3.66482730e-01  2.68200896e-01
 -5.18107727e-15 -5.51851071e-15 -4.94785673e-15 -4.80772185e-15
 -4.82572965e-15 -4.95266725e-15 -4.95869609e-15 -4.97235917e-15
 -4.02988473e-15 -5.41608102e-15 -4.25394619e-15 -5.01695027e-15
 -4.53671212e-15 -4.75258718e-15 -5.11631503e-15 -4.31415791e-15
 -5.01047269e-15 -4.86742079e-15 -4.71372505e-15 -3.99116292e-15]

approx error on U for Validation Data after updating U  [[0.24267877769855098, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036, 3.6125006243733644], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.3341363616556508, 0.39820873737209206, 0.5413699612501198], [0.24267877769855115, 0.2973521909173338, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015, 0.3500000000000061], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.20000000000000498], [0.242678777698551, 0.29735219091733733, 0.27375205578352574, 0.23611652516689285, 0.34017933632769454, 0.33413636165565125, 0.3986182455403873, 0.23360339995603016, 0.28178804220163, 0.48727790838879315], [0.24267276644603455, 0.29735219091733545, 0.27377845318716604, 0.23451881036466365, 0.3401793363276946, 0.33549771278318047, 0.39820873737209056, 0.22683046676263613, 0.28178804220161013, 1.1728620783598136], [0.24265249310254342, 0.2973521909173312, 0.27375205578351836, 0.23371984987019778, 0.340179336327695, 0.3986182455403887, 0.2270167672521204, 0.2817880422016302, 0.3231351337337901, 1.5970207932723892], [0.24267877769855112, 0.29735219091734943, 0.27377845318716487, 0.23611652516689188, 0.39820873737209095, 0.2336193534666589, 0.2817880422016303, 0.3333491588620607, 0.29750721221400134, 0.3500000000000048]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.46251654  0.47278272  0.42777153 -0.08414159  0.37085375  0.54236823
  0.35674241  0.57472257 -0.01523324  0.27343316  0.34095769  0.14276888
  0.41593601  0.21100663 -0.1435183   0.46137601  0.29664449 -0.13244145
  0.78739651  0.14011143  0.59154531  0.33897309  0.31204881  0.52657586
  0.21463722  0.36962173  0.26934965  0.47993553  0.12773773  0.08161826
  0.27686579  0.35466079  0.45915224  0.43924284  0.29804132  0.37811537
  0.05210179  0.44775751  0.21242823  0.69355498  1.69393118  1.63373773
  1.49255662  1.52335234  1.4242551   1.58317986  1.48247703  1.58546302
  1.31885502  1.61410071  1.35995021  1.57295629  1.37833858  1.43659425
  1.60929092  1.3743317   1.60832412  1.36814606  1.45940009  1.18685871
  0.29304959  0.2906977   0.75646264  0.56554081  0.3262828   0.011171
  0.72391958  0.26700694  0.36323232  0.36690323  0.15167714  0.46271408
  0.74554081  0.43696083  0.45910018  0.50639325  0.56761295  0.48595598
  0.39012026  0.75857119  1.0817072   1.10644112  1.05306328  1.14544446
  1.13040319  1.11013466  1.06853138  1.13034591  1.08766799  1.07948072
  1.18454251  1.02458508  1.09120668  1.09012226  1.06913169  0.98416702
  1.19258369  1.06580639  1.11525957  1.0063612 ]

approx error on V for Validation Data after updating V  [[0.3594888749593668, 0.39127929446089127, 0.44827966485148363, 5.6391632035887636, 1.2353049767603683], [0.441828670924199, 0.38406726017293075, 0.40305602082422726, 0.5268244446936265, 0.8924325973686938], [0.4418286709241979, 0.40451618020483854, 5.2390127728398594, 1.2853049767603693, 0.5816677302860594], [0.4235089257700596, 0.8924325973687317, 0.5079158217231468, 0.404516180204851, 3.995347576959955], [0.3537308765669242, 0.51241785331602, 0.4045161802048402, 0.365809380659885, 4.459467692177759], [0.38031777241113396, 0.40176326449938954, 0.40451618020483765, 0.30081871705957725, 0.8924325973687448], [0.32010594178007995, 0.30990743115021624, 3.5630807786546397, 0.4045161802048372, 0.3375161592095289], [0.38040501601000926, 0.4045161802048304, 1.2353049767603663, 0.4251364666950413, 0.8924325973687222]]

overlaps  [[0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 1, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444, 0.06666666666666668, 0.04444444444444444, 0.06666666666666668, 0.06666666666666668]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.2222222222222222, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111]
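The overlap values printed above are all multiples of 1/9, consistent with each exemplar subset holding 9 items and overlap(A, B) = |A ∩ B| / 9. A hedged sketch of that pairwise computation; the subset contents below are illustrative stand-ins:

```python
# Pairwise overlap between exemplar subsets, as a fraction of subset size.
# Assumes all subsets share one size (9 in this run, judging by the 1/9 grid).
def pairwise_overlap(subsets):
    size = len(subsets[0])
    return [[len(set(a) & set(b)) / size for b in subsets] for a in subsets]
```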

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]
Intel MKL WARNING: Support of Intel(R) Advanced Vector Extensions (Intel(R) AVX) enabled only processors has been deprecated. Intel oneAPI Math Kernel Library will use Intel(R) Streaming SIMD Extensions 4.2 (Intel(R) SSE4.2) instructions instead.

alpha shape  (1780,)

alpha  [ 2.91683344e-13 -2.22044605e-15 -7.63833441e-14 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]
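Given W_V_val has shape (n_val_preds, n_train) = (300, 1780) and alpha has shape (1780,), a natural reading is that alpha is fit so that W_V_val @ alpha approximates the pooled 0/1 LLM loss vector, with the printed "approximation" and per-entry error derived from it. The actual solver is not shown in the log; the least-squares sketch below uses small stand-in shapes and random data:

```python
import numpy as np

# Hedged reconstruction: fit alpha so W @ alpha approximates the 0/1 loss y.
# Shapes here are small stand-ins for W_V_val (300, 1780) and its loss vector.
rng = np.random.default_rng(0)
W = rng.random((30, 50))                   # stand-in for W_V_val
y = rng.integers(0, 2, 30).astype(float)   # stand-in for the 0/1 LLM loss
alpha, *_ = np.linalg.lstsq(W, y, rcond=None)
approx = W @ alpha                         # the "approximation" vector
err = np.abs(approx - y)                   # per-entry approximation error
```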

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.33166203  0.22476294  0.15357077  0.09715645  0.27771123  0.27064746
  0.06794016  0.40725662 -0.1017536   0.13004861 -0.0070059   0.09273217
  0.12829673  0.20551276  0.08864994  0.13038108 -0.0959933   0.25601699
  0.17423639  0.03574978  0.05669492  0.23865117  0.30983121  0.05799554
  0.09385079  0.12771378  0.27306829  0.14448473  0.12764528  0.25251942
  0.27459064  0.15846121  0.16533228  0.21105674  0.29745928  0.42179826
  0.39303333  0.06537089  0.14958501  0.10881832  0.03408627  0.03051751
  0.17389843  0.15627977  0.36938046  0.14378503  0.21827995  0.10859203
  0.31236319  0.0219045   0.40779056  0.04275491  0.24448336  0.35155542
  0.02506544  0.54509764  0.23541555  0.14190349  0.22591559  0.26076365
  0.13856973  0.07918919  0.16846358  0.02926486  0.11151865  0.17572124
  0.21716202  0.03685353  0.08975855  0.25247614  0.05459286  0.30160569
  0.15480577  0.16052382  0.39718613  0.27790806  0.16492102 -0.09784855
  0.18293449  0.00581439  0.24639161  0.25362344  0.20326324  0.25905218
  0.37114467  0.4198046   0.29349179  0.33949627  0.47307145  0.24185348
  0.48609743  0.07951602  0.1748931   0.3969155   0.03909352  0.30168987
  0.14959739  0.459705    0.32343991  0.33298549  0.07044501 -0.06910789
  0.1479575   0.14013001  0.12552321  0.1129821   0.01713206 -0.13467153
  0.19372077 -0.01948492  0.32887742  0.41350648  0.34133853  0.21178356
  0.44563486  0.25483268  0.03329569  0.07583782 -0.12818945  0.24958923
  0.28411536 -0.01280278  0.06336972  0.07160814  0.10047051  0.3871082
  0.32507695  0.51207936  0.20267353  0.02332904  0.10722372  0.07442154
  0.14077513  0.15308255  0.28279133  0.31112127  0.05339112  0.21368885
  0.29889329  0.38866572  0.37361936  0.23513597  0.21497278  0.3429966
  0.4041069   0.15691536  0.14087282  0.46936722  0.10579333  0.3973235
  0.10085209  0.23755071  0.05618909 -0.04973965  0.3172227   0.45803721
  0.22353452  0.20005879  0.11266447  0.41933351  0.04447453  0.1711319
 -0.12589526 -0.01962653  0.37575912  0.13984869  0.20435467  0.44521043
  0.44470198 -0.15847526  0.03897496  0.19427303  0.1968371   0.2716548
 -0.03208541  0.23688785  0.35118738  0.37254961  0.42503481  0.19731002
  0.21167199  0.34604282  0.42341412  0.27581717  0.34670084  0.32624518
  0.36055807  0.28409184  0.10169843  0.44337599  0.30236686  0.37822921
  0.39090345  0.45273268  0.42980351  0.47322067  0.39619006  0.25734704
  0.38545052  0.23341002]

approx error on U on val data  [[0.24267877769855098, 0.41533715900924556, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.36807271653053697, 0.3341363616556508, 0.39820873737209206], [0.24267877769855115, 0.2973521909173338, 0.40451618020483854, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.442751308909726], [0.242678777698551, 0.29735219091733733, 0.3520643368808044, 0.27375205578352574, 0.23611652516689285, 0.34017933632769454, 0.33413636165565125, 0.3986182455403873, 0.23360339995603016, 0.28178804220163], [0.24267276644603455, 0.29735219091733545, 0.27377845318716604, 0.23451881036466365, 0.3401793363276946, 0.33549771278318047, 0.39820873737209056, 0.22683046676263613, 0.28178804220161013, 0.4007424437637802], [0.24265249310254342, 0.2973521909173312, 0.27375205578351836, 0.23371984987019778, 0.340179336327695, 0.33413636165565175, 0.3986182455403887, 0.2270167672521204, 0.2817880422016302, 0.3231351337337901], [0.24267877769855112, 0.29735219091734943, 0.27377845318716487, 0.23611652516689188, 0.3401793363276949, 0.39820873737209095, 0.2336193534666589, 0.2817880422016303, 0.3333491588620607, 0.29750721221400134], [0.2426668361242664, 0.29735219091732407, 0.2737520557835221, 0.23352404467719068, 0.3986182455403916, 0.2339091978855239, 0.2817880422016294, 0.333376810702989, 0.2918693553927011, 0.44182867092419914]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.46268882  0.28156103  0.38100303  0.03791582  0.35279751  0.35999585
  0.31842094  0.46621054  0.04625417  0.15442055  0.48027627  0.12245607
  0.52064899  0.30294872 -0.13880534  0.42627191  0.272071    0.12098518
  0.62672805  0.2125494   0.59154531  0.33897309  0.31204881  0.52657586
  0.21463722  0.36962173  0.26934965  0.47993553  0.12773773  0.08161826
  0.27686579  0.35466079  0.45915224  0.43924284  0.29804132  0.37811537
  0.05210179  0.44775751  0.21242823  0.69355498  0.24486684  0.33740243
  0.29904185  0.2855686   0.20520222  0.3490476   0.31612638  0.3458053
  0.24013352  0.30681342  0.03449013  0.13705262  0.10922053  0.08876249
  0.24117388  0.31226539  0.3131631   0.05921712  0.41611602  0.15319395
  0.23023148  0.26372617  0.64101733  0.43909037  0.27111755  0.03788405
  0.71495984  0.35771206  0.37408293  0.2950115   0.14847336  0.3153033
  0.63528007  0.41942785  0.30195604  0.43053649  0.51613312  0.43142626
  0.42678769  0.70115149  0.24870454  0.23234677  0.21167504  0.24473895
  0.17725181  0.16216517  0.2398639   0.48677245  0.19364316  0.08586176
  0.32670308 -0.21004627  0.2093012   0.25883237 -0.12697661  0.18853237
  0.19319398  0.25352958  0.25447042  0.20428507]

approx error on V on Val data  [[0.3563420090207584, 0.30552073465556595, 0.33554531900796836, 0.3463328142555741, 0.30197903682795574], [0.2699766326061984, 0.43664479101758474, 0.441828670924199, 0.2941347128912874, 0.3541301283812711], [0.4418286709241979, 0.3832022746629383, 0.30155642826846296, 0.3304427693794233, 0.3120386944680726], [0.44182867092419825, 0.404516180204851, 0.25851996965785634, 0.3027439819624869, 0.39204624029599416], [0.37967892205407766, 0.31203869446807575, 0.3258229735468181, 0.4045161802048402, 0.38575951518888096], [0.3416127722541395, 0.3491886879903772, 0.40451618020483765, 0.2961968473389621, 0.3785311614495693], [0.3497449903081983, 0.3526889104895011, 0.4045161802048372, 0.25324643402482755, 0.3120386944680744], [0.2993385374723855, 0.2693353030842348, 0.4185823861646439, 0.4045161802048304, 0.3085625353480101], [0.36676396260753735, 0.4045161802048452, 0.35413012838126734, 0.4196748494313803, 0.31203869446807325]]

predicting:   0%|          | 0/1 [00:00<?, ?it/s]
predicting: 100%|██████████| 1/1 [00:33<00:00, 33.35s/it]

Make new V by taking the top-v highest-loss subsets from L \ U

predicting:   0%|          | 0/5 [00:00<?, ?it/s]
predicting:  20%|██        | 1/5 [00:33<02:15, 33.93s/it]
predicting:  40%|████      | 2/5 [01:07<01:41, 33.91s/it]
predicting:  60%|██████    | 3/5 [01:43<01:09, 34.56s/it]
predicting:  80%|████████  | 4/5 [02:12<00:32, 32.57s/it]
predicting: 100%|██████████| 5/5 [02:48<00:00, 33.71s/it]

***********************************
S_worst_ind  9

********* LLM LOSS ON U FOR VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1]

AVG_LLM_loss_on_VAL_data  [0.275, 0.26499999999999996, 0.24499999999999997, 0.24499999999999997, 0.22999999999999998, 0.22999999999999998, 0.21999999999999997, 0.21000000000000002, 0.215, 0.21999999999999997]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.4, 0.35, 0.35, 0.35, 0.35, 0.35, 0.3, 0.3, 0.35, 0.4]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.25, 0.2, 0.25, 0.3, 0.3], [0.2, 0.4, 0.35, 0.2, 0.25], [0.35, 0.3, 0.3, 0.25, 0.2], [0.35, 0.35, 0.2, 0.2, 0.3], [0.3, 0.2, 0.3, 0.35, 0.4], [0.35, 0.25, 0.35, 0.25, 0.3], [0.3, 0.25, 0.35, 0.25, 0.2], [0.25, 0.2, 0.45, 0.35, 0.25], [0.3, 0.35, 0.25, 0.4, 0.2], [0.35, 0.3, 0.3, 0.35, 0.2]]

AVG_LLM_loss_on_VAL_data  [0.26, 0.28, 0.27999999999999997, 0.27999999999999997, 0.30999999999999994, 0.3, 0.26999999999999996, 0.3, 0.29999999999999993, 0.29999999999999993]

MIN_LLM_loss_on_VAL_data  [0.2, 0.2, 0.2, 0.2, 0.2, 0.25, 0.2, 0.2, 0.2, 0.2]

MAX_LLM_loss_on_VAL_data  [0.3, 0.4, 0.35, 0.35, 0.4, 0.35, 0.35, 0.45, 0.4, 0.35]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.33166203  0.22476294  0.15357077  0.09715645  0.27771123  0.27064746
  0.06794016  0.40725662 -0.1017536   0.13004861 -0.0070059   0.09273217
  0.12829673  0.20551276  0.08864994  0.13038108 -0.0959933   0.25601699
  0.17423639  0.03574978  0.05669492  0.23865117  0.30983121  0.05799554
  0.09385079  0.12771378  0.27306829  0.14448473  0.12764528  0.25251942
  0.27459064  0.15846121  0.16533228  0.21105674  0.29745928  0.42179826
  0.39303333  0.06537089  0.14958501  0.10881832  0.03408627  0.03051751
  0.17389843  0.15627977  0.36938046  0.14378503  0.21827995  0.10859203
  0.31236319  0.0219045   0.40779056  0.04275491  0.24448336  0.35155542
  0.02506544  0.54509764  0.23541555  0.14190349  0.22591559  0.26076365
  0.13856973  0.07918919  0.16846358  0.02926486  0.11151865  0.17572124
  0.21716202  0.03685353  0.08975855  0.25247614  0.05459286  0.30160569
  0.15480577  0.16052382  0.39718613  0.27790806  0.16492102 -0.09784855
  0.18293449  0.00581439  0.24639161  0.25362344  0.20326324  0.25905218
  0.37114467  0.4198046   0.29349179  0.33949627  0.47307145  0.24185348
  0.48609743  0.07951602  0.1748931   0.3969155   0.03909352  0.30168987
  0.14959739  0.459705    0.32343991  0.33298549  0.07044501 -0.06910789
  0.1479575   0.14013001  0.12552321  0.1129821   0.01713206 -0.13467153
  0.19372077 -0.01948492  0.32887742  0.41350648  0.34133853  0.21178356
  0.44563486  0.25483268  0.03329569  0.07583782 -0.12818945  0.24958923
  0.28411536 -0.01280278  0.06336972  0.07160814  0.10047051  0.3871082
  0.32507695  0.51207936  0.20267353  0.02332904  0.10722372  0.07442154
  0.14077513  0.15308255  0.28279133  0.31112127  0.05339112  0.21368885
  0.29889329  0.38866572  0.37361936  0.23513597  0.21497278  0.3429966
  0.4041069   0.15691536  0.14087282  0.46936722  0.10579333  0.3973235
  0.10085209  0.23755071  0.05618909 -0.04973965  0.3172227   0.45803721
  0.22353452  0.20005879  0.11266447  0.41933351  0.04447453  0.1711319
 -0.12589526 -0.01962653  0.37575912  0.13984869  0.20435467  0.44521043
  0.44470198 -0.15847526  0.03897496  0.19427303  0.1968371   0.2716548
 -0.03208541  0.23688785  0.35118738  0.37254961  0.42503481  0.19731002
 -0.4403582  -0.37950467 -0.37336973 -0.44001766 -0.39334278 -0.38104777
 -0.37015672 -0.41142352 -0.37450122 -0.37569897 -0.37366062 -0.39832273
 -0.39504164 -0.36232417 -0.3913664  -0.33226743 -0.383735   -0.42601738
 -0.34209686 -0.38719506]

approx error on U for Validation Data after updating U  [[0.24267877769855098, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036, 3.6125006243733644], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.3341363616556508, 0.39820873737209206, 0.5413699612501198], [0.24267877769855115, 0.2973521909173338, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015, 0.3500000000000061], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.20000000000000498], [0.242678777698551, 0.29735219091733733, 0.27375205578352574, 0.23611652516689285, 0.34017933632769454, 0.33413636165565125, 0.3986182455403873, 0.23360339995603016, 0.28178804220163, 0.48727790838879315], [0.24267276644603455, 0.29735219091733545, 0.27377845318716604, 0.23451881036466365, 0.3401793363276946, 0.33549771278318047, 0.39820873737209056, 0.22683046676263613, 0.28178804220161013, 1.1728620783598136], [0.24265249310254342, 0.2973521909173312, 0.27375205578351836, 0.23371984987019778, 0.340179336327695, 0.3986182455403887, 0.2270167672521204, 0.2817880422016302, 0.3231351337337901, 1.5970207932723892], [0.24267877769855112, 0.29735219091734943, 0.27377845318716487, 0.23611652516689188, 0.39820873737209095, 0.2336193534666589, 0.2817880422016303, 0.3333491588620607, 0.29750721221400134, 0.3500000000000048], [0.2426668361242664, 0.29735219091732407, 0.2737520557835221, 0.23352404467719068, 0.3986182455403916, 0.2339091978855239, 0.2817880422016294, 0.333376810702989, 0.2918693553927011, 0.7865724266407088]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [0.21167199 0.34604282 0.42341412 0.27581717 0.34670084 0.32624518
 0.36055807 0.28409184 0.10169843 0.44337599 0.30236686 0.37822921
 0.39090345 0.45273268 0.42980351 0.47322067 0.39619006 0.25734704
 0.38545052 0.23341002 0.23023148 0.26372617 0.64101733 0.43909037
 0.27111755 0.03788405 0.71495984 0.35771206 0.37408293 0.2950115
 0.14847336 0.3153033  0.63528007 0.41942785 0.30195604 0.43053649
 0.51613312 0.43142626 0.42678769 0.70115149 1.48449958 1.49300176
 1.51744681 1.35107703 1.4624786  1.3574583  1.39679926 1.31820198
 1.28908065 1.47873739 1.43004257 1.54363034 1.46849544 1.45538837
 1.53111718 1.45523043 1.64101778 1.26155798 1.31913574 1.12699193
 0.59154531 0.33897309 0.31204881 0.52657586 0.21463722 0.36962173
 0.26934965 0.47993553 0.12773773 0.08161826 0.27686579 0.35466079
 0.45915224 0.43924284 0.29804132 0.37811537 0.05210179 0.44775751
 0.21242823 0.69355498 4.06526665 4.15377448 4.36113271 3.93687179
 4.08226997 3.86505867 4.39318998 4.14154518 3.8160922  3.9668813
 3.79219502 3.9808554  4.17634436 4.07672598 4.03689184 3.68171274
 4.32557219 3.80767962 4.12913771 3.67732925]

approx error on V for Validation Data after updating V  [[0.3594888749593668, 0.39127929446089127, 0.44827966485148363, 5.6391632035887636, 1.2353049767603683], [0.441828670924199, 0.38406726017293075, 0.40305602082422726, 0.5268244446936265, 0.8924325973686938], [0.4418286709241979, 0.40451618020483854, 5.2390127728398594, 1.2853049767603693, 0.5816677302860594], [0.4235089257700596, 0.8924325973687317, 0.5079158217231468, 0.404516180204851, 3.995347576959955], [0.3537308765669242, 0.51241785331602, 0.4045161802048402, 0.365809380659885, 4.459467692177759], [0.38031777241113396, 0.40176326449938954, 0.40451618020483765, 0.30081871705957725, 0.8924325973687448], [0.32010594178007995, 0.30990743115021624, 3.5630807786546397, 0.4045161802048372, 0.3375161592095289], [0.38040501601000926, 0.4045161802048304, 1.2353049767603663, 0.4251364666950413, 0.8924325973687222], [0.44182867092419914, 0.37831693428247526, 1.119069456747736, 0.4045161802048452, 3.8233263521929524]]

overlaps  [[0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 1, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444, 0.06666666666666668, 0.04444444444444444, 0.06666666666666668, 0.06666666666666668, 0.06666666666666668]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.2222222222222222, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111]
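The overlap statistics above come in multiples of 1/9, consistent with comparing exemplar subsets of nine items each. A minimal sketch of how such a pairwise-overlap table could be computed (function and variable names are hypothetical, not taken from the actual code):

```python
# Hypothetical sketch: each entry is |S_i ∩ S_j| / subset_size for every
# ordered pair of distinct subsets, so values land on multiples of 1/size.

def pairwise_overlap(subsets):
    """Fraction of shared exemplar indices between every pair of subsets."""
    k = len(subsets)
    size = len(subsets[0])
    return [
        [len(set(subsets[i]) & set(subsets[j])) / size
         for j in range(k) if j != i]
        for i in range(k)
    ]

subsets = [[0, 1, 2], [2, 3, 4], [4, 5, 0]]
result = pairwise_overlap(subsets)
print(result)  # each row lists overlaps of one subset with the others
```

With size-9 subsets, a single shared exemplar yields 1/9 ≈ 0.111 and two shared exemplars yield 2/9 ≈ 0.222, matching the values logged above.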

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]
Intel MKL WARNING: Support of Intel(R) Advanced Vector Extensions (Intel(R) AVX) enabled only processors has been deprecated. Intel oneAPI Math Kernel Library will use Intel(R) Streaming SIMD Extensions 4.2 (Intel(R) SSE4.2) instructions instead.
(repeated Intel MKL AVX-deprecation warnings omitted)

alpha shape  (1780,)

alpha  [ 3.41393580e-15  6.66133815e-15 -1.77635684e-15 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.33171381  0.22471154  0.15357344  0.09720271  0.27771317  0.27065934
  0.06794252  0.40731765 -0.10172885  0.13016544 -0.0070416   0.09279391
  0.12821402  0.2054747   0.08864335  0.1303556  -0.09602699  0.25600542
  0.17412081  0.03575881  0.05669492  0.23865117  0.30983121  0.05799554
  0.09385079  0.12771378  0.27306829  0.14448473  0.12764528  0.25251942
  0.27459064  0.15846121  0.16533228  0.21105674  0.29745928  0.42179826
  0.39303333  0.06537089  0.14958501  0.10881832  0.03411625  0.03055004
  0.17401894  0.15628269  0.36927185  0.14380753  0.21832514  0.1085929
  0.31231828  0.0218874   0.40761169  0.04273973  0.24470926  0.35162779
  0.02502319  0.54511499  0.23535247  0.14200114  0.22580834  0.26066528
  0.16799882  0.06445564  0.14251597  0.03418222  0.07329537  0.18048365
  0.21611726  0.04972717  0.0816812   0.27408139  0.05895845  0.30186136
  0.1451324   0.19454175  0.41545716  0.27484768  0.18434745 -0.12281418
  0.1898655  -0.02755502  0.24772708  0.24633703  0.19834295  0.2541286
  0.37923563  0.42201021  0.29120014  0.35271566  0.47502845  0.23565366
  0.49592469  0.07611124  0.17611575  0.39532443  0.04002643  0.3003077
  0.15071586  0.45347551  0.31974513  0.33845825  0.09400506 -0.068054
  0.14032448  0.13553503  0.12666721  0.09428947  0.01080396 -0.10709586
  0.17263069 -0.01512115  0.32793375  0.41368764  0.33293954  0.21921901
  0.4570447   0.23839372  0.02528823  0.08385501 -0.14184423  0.27292932
  0.28411536 -0.01280278  0.06336972  0.07160814  0.10047051  0.3871082
  0.32507695  0.51207936  0.20267353  0.02332904  0.10722372  0.07442154
  0.14077513  0.15308255  0.28279133  0.31112127  0.05339112  0.21368885
  0.29889329  0.38866572  0.37365231  0.23514836  0.2149687   0.3429348
  0.40413979  0.15695285  0.14094902  0.46944613  0.10574871  0.39737153
  0.10077948  0.23755704  0.05612695 -0.04975682  0.31715811  0.45806836
  0.22351901  0.20002275  0.11268276  0.41930784  0.06994208  0.18114191
 -0.11425074  0.03752928  0.36909445  0.14284588  0.20825994  0.45101437
  0.45209919 -0.12820566  0.04946306  0.16182014  0.20511408  0.24291923
 -0.06506296  0.18963511  0.29452746  0.39605446  0.40465558  0.22198381
  0.38835857  0.7199201   0.42800233  0.37072269  0.18819301  0.74034339
  0.51211634  0.71008068  0.03677783  0.65295877  0.11017024  0.41078656
 -0.28755202  0.45449044  0.42791304  0.69703783  0.39003783 -0.0699278
  0.67131917  0.33914162]

approx error on U on val data  [[0.24267877769855098, 0.41533715900924556, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.36807271653053697, 0.3341363616556508, 0.39820873737209206], [0.24267877769855115, 0.2973521909173338, 0.40451618020483854, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.442751308909726], [0.242678777698551, 0.29735219091733733, 0.3520643368808044, 0.27375205578352574, 0.23611652516689285, 0.34017933632769454, 0.33413636165565125, 0.3986182455403873, 0.23360339995603016, 0.28178804220163], [0.24267276644603455, 0.29735219091733545, 0.27377845318716604, 0.23451881036466365, 0.3401793363276946, 0.33549771278318047, 0.39820873737209056, 0.22683046676263613, 0.28178804220161013, 0.4007424437637802], [0.24265249310254342, 0.2973521909173312, 0.27375205578351836, 0.23371984987019778, 0.340179336327695, 0.33413636165565175, 0.3986182455403887, 0.2270167672521204, 0.2817880422016302, 0.3231351337337901], [0.24267877769855112, 0.29735219091734943, 0.27377845318716487, 0.23611652516689188, 0.3401793363276949, 0.39820873737209095, 0.2336193534666589, 0.2817880422016303, 0.3333491588620607, 0.29750721221400134], [0.2426668361242664, 0.29735219091732407, 0.2737520557835221, 0.23352404467719068, 0.3986182455403916, 0.2339091978855239, 0.2817880422016294, 0.333376810702989, 0.2918693553927011, 0.44182867092419914], [0.24267877769855134, 0.2973521909173334, 0.2737784531871662, 0.23611652516689233, 0.3982087373720905, 
0.23138203505040958, 0.2817880422016307, 0.3333671863962688, 0.29255208290222817, 0.36736780713534667]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.21167199  0.34604282  0.42341412  0.27581717  0.34670084  0.32624518
  0.36055807  0.28409184  0.10169843  0.44337599  0.30236686  0.37822921
  0.39090345  0.45273268  0.42980351  0.47322067  0.39619006  0.25734704
  0.38545052  0.23341002  0.05224408  0.14768312  0.48963005  0.23377807
  0.10779609 -0.08039639  0.76747425  0.49985959  0.3123792   0.1781608
  0.02121217  0.02701197  0.49127175  0.39255322  0.01855285  0.28702786
  0.4366394   0.34283671  0.45060084  0.7309901   0.34024194  0.52684135
  0.52371727 -0.00288125  0.71876918 -0.0977835   0.19104767  0.48027809
 -0.14038156  0.3791685   0.20033687  0.57759771  0.50429211  0.2094023
  0.12115908  0.64346871  0.2909322  -0.11786969  0.24664721  0.33645997
  0.59154531  0.33897309  0.31204881  0.52657586  0.21463722  0.36962173
  0.26934965  0.47993553  0.12773773  0.08161826  0.27686579  0.35466079
  0.45915224  0.43924284  0.29804132  0.37811537  0.05210179  0.44775751
  0.21242823  0.69355498 -0.13145451  0.20592011  0.60912174 -0.02971638
  0.26552816 -0.04166488  0.35396577 -0.11175878  0.36626277  0.01113175
  0.24560133 -0.04739942  0.432732    0.04699896  0.20839525  0.37144384
  0.55817908 -0.19346352  0.67230004  0.09350943]

approx error on V on Val data  [[0.3563420090207584, 0.30552073465556595, 0.33554531900796836, 0.3463328142555741, 0.30197903682795574], [0.2699766326061984, 0.43664479101758474, 0.441828670924199, 0.2941347128912874, 0.3541301283812711], [0.4418286709241979, 0.3832022746629383, 0.30155642826846296, 0.3304427693794233, 0.3120386944680726], [0.44182867092419825, 0.404516180204851, 0.25851996965785634, 0.3027439819624869, 0.39204624029599416], [0.37967892205407766, 0.31203869446807575, 0.3258229735468181, 0.4045161802048402, 0.38575951518888096], [0.3416127722541395, 0.3491886879903772, 0.40451618020483765, 0.2961968473389621, 0.3785311614495693], [0.3497449903081983, 0.3526889104895011, 0.4045161802048372, 0.25324643402482755, 0.3120386944680744], [0.2993385374723855, 0.2693353030842348, 0.4185823861646439, 0.4045161802048304, 0.3085625353480101], [0.36676396260753735, 0.4045161802048452, 0.35413012838126734, 0.4196748494313803, 0.31203869446807325], [0.4418286709241978, 0.3084669062088238, 0.3348636980905465, 0.40451618020483826, 0.25998069087407705]]

predicting: 100%|██████████| 1/1 [00:38<00:00, 38.52s/it]

Make new V by taking top v highest loss subsets from L \ U
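The step logged above rebuilds the candidate pool V from the subsets in L \ U with the highest loss. A hedged sketch of that selection, assuming per-subset losses are available by index (all names here are hypothetical):

```python
# Sketch of "take top v highest-loss subsets from L \ U": filter out
# subsets already in U, then keep the v remaining ones with largest loss.

def top_v_by_loss(candidate_ids, u_ids, losses, v):
    """Return the v candidate ids not already in U with the largest loss."""
    in_u = set(u_ids)
    pool = [i for i in candidate_ids if i not in in_u]
    pool.sort(key=lambda i: losses[i], reverse=True)
    return pool[:v]

losses = {0: 0.2, 1: 0.9, 2: 0.5, 3: 0.7}
print(top_v_by_loss([0, 1, 2, 3], [1], losses, 2))  # -> [3, 2]
```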

predicting: 100%|██████████| 5/5 [02:37<00:00, 31.54s/it]

***********************************
S_worst_ind  9

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1]

AVG_LLM_loss_on_VAL_data  [0.275, 0.26499999999999996, 0.24499999999999997, 0.24499999999999997, 0.22999999999999998, 0.22999999999999998, 0.21999999999999997, 0.21000000000000002, 0.215, 0.21999999999999997, 0.205]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.4, 0.35, 0.35, 0.35, 0.35, 0.35, 0.3, 0.3, 0.35, 0.4, 0.3]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.25, 0.2, 0.25, 0.3, 0.3], [0.2, 0.4, 0.35, 0.2, 0.25], [0.35, 0.3, 0.3, 0.25, 0.2], [0.35, 0.35, 0.2, 0.2, 0.3], [0.3, 0.2, 0.3, 0.35, 0.4], [0.35, 0.25, 0.35, 0.25, 0.3], [0.3, 0.25, 0.35, 0.25, 0.2], [0.25, 0.2, 0.45, 0.35, 0.25], [0.3, 0.35, 0.25, 0.4, 0.2], [0.35, 0.3, 0.3, 0.35, 0.2], [0.35, 0.35, 0.25, 0.35, 0.2]]

AVG_LLM_loss_on_VAL_data  [0.26, 0.28, 0.27999999999999997, 0.27999999999999997, 0.30999999999999994, 0.3, 0.26999999999999996, 0.3, 0.29999999999999993, 0.29999999999999993, 0.29999999999999993]

MIN_LLM_loss_on_VAL_data  [0.2, 0.2, 0.2, 0.2, 0.2, 0.25, 0.2, 0.2, 0.2, 0.2, 0.2]

MAX_LLM_loss_on_VAL_data  [0.3, 0.4, 0.35, 0.35, 0.4, 0.35, 0.35, 0.45, 0.4, 0.35, 0.35]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 3.31713813e-01  2.24711544e-01  1.53573440e-01  9.72027148e-02
  2.77713168e-01  2.70659343e-01  6.79425198e-02  4.07317648e-01
 -1.01728853e-01  1.30165444e-01 -7.04159776e-03  9.27939078e-02
  1.28214019e-01  2.05474696e-01  8.86433456e-02  1.30355604e-01
 -9.60269869e-02  2.56005415e-01  1.74120812e-01  3.57588083e-02
  5.66949239e-02  2.38651171e-01  3.09831214e-01  5.79955385e-02
  9.38507949e-02  1.27713784e-01  2.73068286e-01  1.44484730e-01
  1.27645284e-01  2.52519421e-01  2.74590640e-01  1.58461207e-01
  1.65332281e-01  2.11056741e-01  2.97459280e-01  4.21798257e-01
  3.93033332e-01  6.53708857e-02  1.49585014e-01  1.08818317e-01
  3.41162511e-02  3.05500447e-02  1.74018936e-01  1.56282691e-01
  3.69271847e-01  1.43807533e-01  2.18325141e-01  1.08592899e-01
  3.12318285e-01  2.18873972e-02  4.07611686e-01  4.27397252e-02
  2.44709262e-01  3.51627792e-01  2.50231883e-02  5.45114994e-01
  2.35352468e-01  1.42001143e-01  2.25808339e-01  2.60665278e-01
  1.67998823e-01  6.44556358e-02  1.42515970e-01  3.41822227e-02
  7.32953674e-02  1.80483654e-01  2.16117257e-01  4.97271723e-02
  8.16811992e-02  2.74081387e-01  5.89584494e-02  3.01861363e-01
  1.45132399e-01  1.94541750e-01  4.15457156e-01  2.74847678e-01
  1.84347449e-01 -1.22814177e-01  1.89865497e-01 -2.75550208e-02
  2.47727082e-01  2.46337029e-01  1.98342954e-01  2.54128599e-01
  3.79235632e-01  4.22010209e-01  2.91200142e-01  3.52715658e-01
  4.75028449e-01  2.35653660e-01  4.95924690e-01  7.61112392e-02
  1.76115754e-01  3.95324434e-01  4.00264327e-02  3.00307697e-01
  1.50715863e-01  4.53475508e-01  3.19745134e-01  3.38458254e-01
  9.40050588e-02 -6.80539959e-02  1.40324481e-01  1.35535027e-01
  1.26667207e-01  9.42894750e-02  1.08039555e-02 -1.07095859e-01
  1.72630693e-01 -1.51211486e-02  3.27933753e-01  4.13687637e-01
  3.32939542e-01  2.19219011e-01  4.57044703e-01  2.38393723e-01
  2.52882335e-02  8.38550069e-02 -1.41844231e-01  2.72929321e-01
  2.84115362e-01 -1.28027850e-02  6.33697161e-02  7.16081405e-02
  1.00470514e-01  3.87108201e-01  3.25076953e-01  5.12079363e-01
  2.02673525e-01  2.33290404e-02  1.07223717e-01  7.44215374e-02
  1.40775132e-01  1.53082553e-01  2.82791325e-01  3.11121265e-01
  5.33911167e-02  2.13688854e-01  2.98893292e-01  3.88665724e-01
  3.73652310e-01  2.35148359e-01  2.14968702e-01  3.42934802e-01
  4.04139789e-01  1.56952852e-01  1.40949018e-01  4.69446131e-01
  1.05748711e-01  3.97371529e-01  1.00779476e-01  2.37557042e-01
  5.61269542e-02 -4.97568172e-02  3.17158108e-01  4.58068364e-01
  2.23519005e-01  2.00022753e-01  1.12682759e-01  4.19307843e-01
  6.99420791e-02  1.81141909e-01 -1.14250740e-01  3.75292752e-02
  3.69094454e-01  1.42845882e-01  2.08259938e-01  4.51014371e-01
  4.52099194e-01 -1.28205657e-01  4.94630635e-02  1.61820145e-01
  2.05114085e-01  2.42919228e-01 -6.50629593e-02  1.89635110e-01
  2.94527455e-01  3.96054462e-01  4.04655580e-01  2.21983806e-01
 -5.95244372e-15 -6.32140570e-15 -6.06592263e-15 -5.28359489e-15
 -5.50192120e-15 -6.00935142e-15 -5.57705162e-15 -5.54754723e-15
 -4.64880223e-15 -6.07816353e-15 -4.93739143e-15 -5.66825419e-15
 -5.46910132e-15 -5.06532198e-15 -6.03585935e-15 -5.20889673e-15
 -5.82352171e-15 -4.55595788e-15 -5.64792186e-15 -4.60892164e-15]

approx error on U for Validation Data after updating U  [[0.24267877769855098, 0.2973521909173336, 0.4045161802048384, 0.35206433688080485, 0.24615462532592264, 0.23611652516689374, 0.3401793363276947, 0.36807271653053764, 0.33413636165565036, 3.6125006243733644], [0.24266683612426648, 0.2973521909173352, 0.4045161802048378, 0.3520643368808049, 0.27377845318716354, 0.23352404467719018, 0.3401793363276925, 0.3341363616556508, 0.39820873737209206, 0.5413699612501198], [0.24267877769855115, 0.2973521909173338, 0.352064336880804, 0.2737520557835228, 0.23611652516689335, 0.3401793363276956, 0.3344226075152737, 0.3986182455403907, 0.23280561987131015, 0.3500000000000061], [0.24266339975865464, 0.2973521909173097, 0.3520643368808031, 0.2737784531871658, 0.23357095656540866, 0.3401793363276949, 0.33413636165565114, 0.3982087373720883, 0.2270167672521198, 0.20000000000000498], [0.242678777698551, 0.29735219091733733, 0.27375205578352574, 0.23611652516689285, 0.34017933632769454, 0.33413636165565125, 0.3986182455403873, 0.23360339995603016, 0.28178804220163, 0.48727790838879315], [0.24267276644603455, 0.29735219091733545, 0.27377845318716604, 0.23451881036466365, 0.3401793363276946, 0.33549771278318047, 0.39820873737209056, 0.22683046676263613, 0.28178804220161013, 1.1728620783598136], [0.24265249310254342, 0.2973521909173312, 0.27375205578351836, 0.23371984987019778, 0.340179336327695, 0.3986182455403887, 0.2270167672521204, 0.2817880422016302, 0.3231351337337901, 1.5970207932723892], [0.24267877769855112, 0.29735219091734943, 0.27377845318716487, 0.23611652516689188, 0.39820873737209095, 0.2336193534666589, 0.2817880422016303, 0.3333491588620607, 0.29750721221400134, 0.3500000000000048], [0.2426668361242664, 0.29735219091732407, 0.2737520557835221, 0.23352404467719068, 0.3986182455403916, 0.2339091978855239, 0.2817880422016294, 0.333376810702989, 0.2918693553927011, 0.7865724266407088], [0.24267877769855134, 0.2973521909173334, 0.2737784531871662, 0.23611652516689233, 
0.3982087373720905, 0.23138203505040958, 0.2817880422016307, 0.3333671863962688, 0.29255208290222817, 0.25000000000000544]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1]

approximation 
 [ 0.21167199  0.34604282  0.42341412  0.27581717  0.34670084  0.32624518
  0.36055807  0.28409184  0.10169843  0.44337599  0.30236686  0.37822921
  0.39090345  0.45273268  0.42980351  0.47322067  0.39619006  0.25734704
  0.38545052  0.23341002  0.59154531  0.33897309  0.31204881  0.52657586
  0.21463722  0.36962173  0.26934965  0.47993553  0.12773773  0.08161826
  0.27686579  0.35466079  0.45915224  0.43924284  0.29804132  0.37811537
  0.05210179  0.44775751  0.21242823  0.69355498  0.43781909  0.55120899
  0.42752767  0.26821163  0.30225347  0.49624945  0.42661311  0.41188647
  0.25059851  0.54242766  0.29409149  0.43674944  0.25149099  0.40307252
  0.49292687  0.42654615  0.51372846  0.15294084  0.48357631  0.08239332
  0.38835857  0.7199201   0.42800233  0.37072269  0.18819301  0.74034339
  0.51211634  0.71008068  0.03677783  0.65295877  0.11017024  0.41078656
 -0.28755202  0.45449044  0.42791304  0.69703783  0.39003783 -0.0699278
  0.67131917  0.33914162  1.0817072   1.10644112  1.05306328  1.14544446
  1.13040319  1.11013466  1.06853138  1.13034591  1.08766799  1.07948072
  1.18454251  1.02458508  1.09120668  1.09012226  1.06913169  0.98416702
  1.19258369  1.06580639  1.11525957  1.0063612 ]

approx error on V for Validation Data after updating V  [[0.3594888749593668, 0.39127929446089127, 0.44827966485148363, 5.6391632035887636, 1.2353049767603683], [0.441828670924199, 0.38406726017293075, 0.40305602082422726, 0.5268244446936265, 0.8924325973686938], [0.4418286709241979, 0.40451618020483854, 5.2390127728398594, 1.2853049767603693, 0.5816677302860594], [0.4235089257700596, 0.8924325973687317, 0.5079158217231468, 0.404516180204851, 3.995347576959955], [0.3537308765669242, 0.51241785331602, 0.4045161802048402, 0.365809380659885, 4.459467692177759], [0.38031777241113396, 0.40176326449938954, 0.40451618020483765, 0.30081871705957725, 0.8924325973687448], [0.32010594178007995, 0.30990743115021624, 3.5630807786546397, 0.4045161802048372, 0.3375161592095289], [0.38040501601000926, 0.4045161802048304, 1.2353049767603663, 0.4251364666950413, 0.8924325973687222], [0.44182867092419914, 0.37831693428247526, 1.119069456747736, 0.4045161802048452, 3.8233263521929524], [0.4418286709241978, 0.40451618020483826, 0.43456820043769373, 0.408480656220215, 0.8924325973687228]]

overlaps  [[0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 1, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444, 0.06666666666666668, 0.04444444444444444, 0.06666666666666668, 0.06666666666666668, 0.06666666666666668, 0.06666666666666668]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.2222222222222222, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111]
while loop completed!



_____________Take the exemplar subset with the minimum validation loss and use it as the final exemplar set

predicting: 100%|██████████| 10/10 [05:41<00:00, 34.19s/it]


avg_err  [0.25, 0.15, 0.2, 0.25, 0.3, 0.15, 0.2, 0.3, 0.2, 0.2]


min ind  1
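The `avg_err` list above is reduced to a single winning index by taking the argmin; a minimal sketch of that step (plain Python, ties broken by first occurrence):

```python
# Select the exemplar subset with the lowest average validation error.
avg_err = [0.25, 0.15, 0.2, 0.25, 0.3, 0.15, 0.2, 0.3, 0.2, 0.2]
min_ind = min(range(len(avg_err)), key=avg_err.__getitem__)
print("min ind ", min_ind)  # -> min ind  1
```

Indices 1 and 5 tie at 0.15; `min` returns the first, which matches the logged `min ind  1`.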

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  No
EM: 0.7510204081632653
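The EM value above is the fraction of predictions that exactly match the ground truth: 368 of the 490 test answers agree, and 368/490 ≈ 0.7510. A minimal sketch of how such a score can be computed from the "Answer:" / "GT:" pairs printed above — the function name `exact_match` and the whitespace normalization are assumptions, not necessarily the script's actual code:

```python
def exact_match(predictions, ground_truths):
    """Fraction of predictions that exactly match the ground truth.

    Both inputs are equal-length lists of strings; comparison is done
    after stripping surrounding whitespace, so an empty model answer
    (as seen a few times in the log) simply counts as a miss.
    """
    assert len(predictions) == len(ground_truths)
    hits = sum(p.strip() == g.strip()
               for p, g in zip(predictions, ground_truths))
    return hits / len(predictions)

# 368 matching answers out of 490 reproduces the EM printed above.
em = exact_match(["Yes"] * 368 + [""] * 122, ["Yes"] * 490)
```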
                                              question answers
0    Does Rusev have to worry about human overpopul...      No
1      Was Eve involved in an incestuous relationship?     Yes
2      Does The Hague border multiple bodies of water?     Yes
3    Could casualties from deadliest war rival Fran...      No
4    Is letter C crucial to spelling the two most c...     Yes
..                                                 ...     ...
485        Is Dustin Hoffman one of the B'nei Yisrael?      No
486           Can you avoid internet trolls on reddit?      No
487     Did Moon Jae-in earn the Abitur as a teenager?      No
488  Did Tokyo Tower designers appreciate Stephen S...      No
489  Does Iphone have more iterations than Samsung ...     Yes

[490 rows x 2 columns]
